Our name is about honesty. The Monitor is owned by The Christian Science Church, and we’ve always been transparent about that.
The Church publishes the Monitor because it sees good journalism as vital to progress in the world. Since 1908, we’ve aimed “to injure no man, but to bless all mankind,” as our founder, Mary Baker Eddy, put it.
Here, you’ll find award-winning journalism not driven by commercial influences – a news organization that takes seriously its mission to uplift the world by seeking solutions and finding reasons for credible hope.
When one becomes a target for online harassment, there are usually only two options: to suffer the slings and arrows or to uproot from the internet – and sometimes one’s real life – entirely.
But Jessikka Aro, a Finnish investigative journalist, found an elusive third option: justice.
Ms. Aro compiled award-winning reports on Russian online propaganda efforts, particularly via so-called “online troll factories” and pro-Russia European media. But her work drew the attention of pro-Russian trolls, who began a hate campaign against Aro. She was harassed via social media; she received death threats; and her personal medical records were published online.
But Aro took the harassers she could identify to court. She sued Johan Bäckman, a pro-Russian Finnish academic, and Ilja Janitskin, publisher of the pro-Kremlin website MV-Lehti, over claims including stalking and defamation. The Finnish prosecutor added criminal charges too.
On Thursday, Aro won her case. The judge found Mr. Janitskin guilty on 16 counts, including defamation, and sentenced him to 22 months in prison. Mr. Bäckman was also found guilty and received a one-year suspended sentence. The two men, along with a third defendant, were ordered to pay 136,000 euros ($157,000) in damages.
“This is massive,” tweeted Ben Nimmo, an Atlantic Council senior fellow who studies disinformation. “Slowly, the law catches up with trolling.”
Now for our five stories of the day.
Parallels to the 1850s abound, but the US is not heading for a second Civil War, historians say. However, “then” was an extreme version of “now,” and the results of its extremity may hold lessons for today.
Resentment and anger. Racial, religious, and cultural animosity. Neat party division. Those things existed in America in abundance in 1850 as well as 2018. There has been lots of apocalyptic second Civil War punditry in recent months – enough to spark a backlash, with articles like “Stop Making Second American Civil War Clickbait.” But in the 1850s, provocative action begat more provocative action, creating and feeding a whirlwind that ended in fighting. National politics became so dysfunctional that it broke down the public consensus that underlay republican governance. Modern America is not in danger of falling into a second Civil War, says Jason Phillips, author of the new book “Looming Civil War.” That said, one of the lessons of the past might be that the polarized factions of politics should pay more careful attention to what the other has to say. Many Northerners didn’t believe Southern threats of secession if Abraham Lincoln were elected president. Southerners scoffed at Northern vows to fight for the Union. “We are better off if we just try to listen to each other and take each other seriously,” Professor Phillips says.
A nation divided into groups of angry, polarized voters.
Political parties splintering under the stress of social and ideological disagreements.
Distrust in institutions. Constant partisan accusations. Widespread conspiracy theories about the perfidy of the other side.
Powerful new communication networks that spread news of all this throughout the United States.
Is this a staccato description of the state of America today? Yes it is. But just as much, it’s a sketch portrait of the 1840s and 1850s, the era of national upheaval prior to the explosion of the Civil War.
These two periods aren’t exact analogies, of course. Chattel slavery was an evil and a means of division rarely matched in history. Nineteenth century America was steeped in personal and political violence; national democratic government then was relatively young and unformed, trying to find its way.
But “then” was an extreme version of “now,” and the results of its extremity may hold lessons for today. In the 1850s provocative action begat more provocative action, creating and then feeding a whirlwind that ended in fighting. National politics became so dysfunctional it broke down the public consensus that underlay republican governance.
Modern America is not in danger of falling into a second Civil War, says Jason Phillips, Eberly Family Distinguished Professor of Civil War Studies at West Virginia University, and author of the new book “Looming Civil War: How Nineteenth-Century Americans Imagined the Future.”
It’s too simplistic to overlay 2018 on 1858 and say that the patterns of today’s events predict a similar outcome, Dr. Phillips says.
That said, one of the lessons of that past conflict might be that the polarized factions of politics should take each other’s words more seriously. Many Northerners didn’t believe Southern threats of secession if Abraham Lincoln were elected president. Southerners scoffed at Northern vows to fight for the Union.
“We are better off if we just try to listen to each other and take each other seriously,” Phillips says.
Before turning the clock back 170 years to examine similarities and differences between modern America and the antebellum era, it might be useful to look at the nature of the current division in the US electorate. That’ll give a baseline from which to proceed and compare citizens now and citizens then.
America today is not split neatly into Northern and Southern factions. There are Democrats in the cities of Texas, Florida, and Georgia. There are Republicans in the hardscrabble areas of rural New England and Maryland’s Eastern Shore.
But there is, increasingly, an “us” and “them.” The election of President Trump did not start this pattern of polarization. It’s been slowly developing for decades. But the Trump era has electrified the barrier of party division. His supporters love what they consider his blunt talk about gender, ethnicity, and racial conflict. His opponents hate it. The result is a spiral of angry words.
Political polarization is at an all-time high. Ironically, it is only partly about politics in the traditional sense. There are issues that still divide Republicans and Democrats – abortion, guns – but policy outcomes aren’t always the main point.
Even when the parties agree on what to do about a particular national problem, they view each other suspiciously and put winning as a group over all else, writes Lilliana Mason, an assistant professor of government and politics at the University of Maryland, in her recently published book “Uncivil Agreement: How Politics Became Our Identity.”
In large part, that is because the parties have become increasingly socially homogeneous, according to Dr. Mason. More and more, Republicans are white, often Christian, and in the Trump era, predominantly male. Democrats are the party of minorities and many whites with higher levels of education.
The parties are more ideologically homogeneous as well. Conservative Southern whites used to be Democrats. As their party became more liberal and pushed major civil rights legislation in the 1960s, those voters drifted toward the Republicans. That switch is now largely complete. Liberal Republicans aren’t extinct, but can see extinction from where they’re standing. Like a rare species of woodpecker, they are now mostly limited to habitat found in small areas in the Northeast.
The bottom line: Social identity is now at the heart of the two big parties that govern America. They’re split along racial, religious, and cultural lines. The crosscutting social ties that once promoted partisan understanding have withered. Democrats and Republicans live in different neighborhoods, send their kids to different schools, attend different churches, and increasingly inhabit their own news and information bubbles.
Social identity is powerful. It can prize winning for “us” above specific policy results. In that sense, candidate Donald Trump was a good fit for a new age in 2016. He vowed to win until his supporters were sick of winning – without many specifics of how, or in what.
“In this political environment, a candidate who picks up the banner of ‘us versus them’ and ‘winning versus losing’ is almost guaranteed to tap into a current of resentment and anger across racial, religious, and cultural lines, which have recently divided neatly by party,” according to Professor Mason.
Resentment and anger. Racial, religious, and cultural animosity. Neat party division. Those things existed in America in abundance in 1850 as well as 2018.
It’s a comparison that’s occurred to lots of political writers. There has been lots of apocalyptic Second Civil War punditry in recent months. Among the most notable: columnist Thomas Friedman in The New York Times (“The American Civil War, Part II”), veteran defense writer and author Thomas Ricks in Foreign Policy (“Will We Have a 2nd Civil War? You Tell Me”), and longtime foreign correspondent Robin Wright in The New Yorker (“Is America Headed for a New Kind of Civil War?”).
Such stories have been numerous enough to spark a backlash. “Stop Making Second American Civil War Clickbait” wrote Dylan Matthews in Vox in June.
Indeed, worries about America coming undone in some new kind of armed and dangerous civil conflict are unfounded, says Phillips, author of “Looming Civil War.” He bases his belief on one big difference between the eras: the way in which people imagine their future.
In the late 1850s, as conflict over slavery increased and the nation’s political atmosphere darkened, Americans began to expect armed conflict, says Phillips. Some dreaded it. Many others, on both sides, embraced it as inevitable and thought war might cleanse and shape the nation to desired ends. They rushed toward combat, physically and metaphorically. Author Louisa May Alcott expressed this view in an 1861 letter, writing that she “yearned for a battle like a warhorse when he smells powder.”
Today we know better. War is a terrible inferno that builds on itself. The anticipatory feeling of the years before Fort Sumter seems naïve.
“We don’t have the same view they had, that the war was something you could control,” says Phillips. “Nowadays you can start wars but they’re not easy to stop. Wars don’t end problems.”
Still, there are political parallels between the eras, Phillips says. The first and most obvious is that the emotional partisanship visible in Washington today (see “division of electorate,” above) was also evident in the 1850s.
The division then was largely between North and South, and the divisive element was slavery. The North feared a despotic “Slave Power” – a sectional conspiracy they felt controlled the national government and was plotting to expand slavery into new territories. The South feared the North wanted to flex its industrial might and larger population to destroy its economic system and way of life.
When one side did something to defend its position, the other saw it as a provocation, and made a defensive move viewed as a provocation in turn. The result was a spiral of outrage, not unlike the situation in Washington today, where partisanship produces legislative gridlock.
“These are the places our politics can go if we get to a point where it is no longer possible to compromise,” says Julie Novkov, professor of political science and women’s studies at the State University of New York at Albany, and contributor to the political history blog “A House Divided.”
In the 1850s, this stress ripped the national political parties apart. Whigs, defined by their opposition to Andrew Jackson’s populist Democrats, fell apart. Democrats split between Northern and Southern factions. These elements recombined, with Democrats on one side and the new Republican Party on the other. The South called the new organization “Black Republicans.” Partisan anger became so intense that physical combat broke out repeatedly in congressional chambers.
The new book “The Field of Blood: Violence in Congress and the Road to Civil War,” by Yale historian Joanne Freeman, is a fascinating look at the details of this conflict and its obvious echoes today. Dr. Freeman documents more than 70 violent clashes between lawmakers in the House or Senate, or nearby, in the years between 1830 and 1860.
Southern lawmakers were at first the drivers of the violence. Dueling was still accepted and even common in the South, and in Washington Southern “bullies” – a word they used themselves – taunted and called out Northern counterparts. Such tactics helped enforce gag rules prohibiting floor discussion of antislavery petitions.
Most Northerners were “noncombatants.” That meant they rejected dueling and the logic of the so-called code of honor that lay behind it as illegal and immoral. At least, they were noncombatants at first. Years of abuse and humiliation pushed many into becoming fighters. They armed themselves against attacks in chambers or on the street. And their constituents approved. Some even sent guns for self-defense to their elected representatives.
Meanwhile, the press heightened the tension. Newspapers in both North and South pushed conspiracy theories about alleged nefarious plots to grab power on the other side. And the new technology of the telegraph gave editors added power. Accusations could quickly blanket the nation. No longer could politicians muddle through with ambiguous positions: The telegraph ended the practice of lawmakers saying one thing to one type of audience, and another thing to another group, secure in the knowledge that neither would hear what had been said to the other.
Things came to a head in February 1858. A hotheaded Southern congressman and a Northern colleague got into a fistfight during a nighttime debate on the incendiary issue of the slavery status of Kansas. Southerners raced in a group to defend their man – and Northerners, some armed, leaped over desks and vaulted down the aisles to return blow for blow. It was a true melee in front of the Speaker’s platform, a sectional conflict that presaged the war to come.
History does not repeat but it does teach, to steal a phrase. Dr. Freeman’s “Field of Blood” shows what can happen when extremely polarized leaders fight over what kind of nation the United States will become in an atmosphere of distrust, threatened violence, and press manipulation. Voters may now hold Congress in low esteem, but what it does, or doesn’t do, matters.
“The lessons of their time ring true today: when trust in the People’s Branch shatters, part of the national ‘we’ falls away,” Freeman concludes.
On the scale of possible futures for the United States, a second Civil War seems an extreme outlier, of course. But there are many other possible outcomes on the conflict spectrum, from scattered fights at political rallies and demonstrations to lone wolf attacks to the organization of quasi-official guard or militia groups. The problem is that discourse has broken down and furor is rushing in to take its place.
Making politics less about social identity and more about self-interested policy choice could go a long way toward lowering the national temperature. Can that be done? Yes, writes the University of Maryland’s Mason, in the sense that social science shows there are some methods that appear to be effective in lessening group conflicts.
One such method is simply increasing contacts. That can reduce prejudice between groups. Nongovernmental organizations could organize occasions for overt cross-partisan socializing. Entertainment media could help by adding more sympathetic partisans of both sides to popular shows.
Political leaders could set and enforce norms for more civil behavior. If they truly want to reduce partisanship, they could simply talk about their opponents in a consistently respectful and unprejudiced way.
The parties could also unite to solve an overarching national goal. The problem here is, what? Absent an alien invasion requiring Americans to pull together, what goal would suffice? Some experts suggest climate change, but that would require Republicans to change position and favor more active intervention on the issue.
Finally, there’s always the possibility of a natural unsorting. Politics is not forever. In the future, demographic groups could switch allegiances for some reason, or rise and fall in power, or mix themselves between the two parties in some way. After all, it hasn’t been long since white unionized factory workers were reliable Democrats. That kind of change could happen again.
Absent that, the current homogenization of the parties might be hard to surmount. “As long as a social divide is maintained between the parties, the electorate will behave more like a pair of warring tribes than like the people of a single nation, caring for their shared future,” concludes Mason in “Uncivil Agreement.”
Check out previous installments of the Democracy Under Strain series.
In the closest US governor’s race, Democrat Stacey Abrams is hoping Southern Millennials' shift in focus from social to economic issues can help propel her to become the nation's first black woman governor.
Stacey Abrams, former Georgia House minority leader and a self-described “nerd,” is on a mission to capture the governor’s mansion. With 1 point separating her and her opponent, Ms. Abrams could become the first black woman in America to lead a state. “Leadership is combining vision with good government while also carving out new spaces for progress,” Abrams said in an interview. She believes that message is unifying in a “moment of tumult.” To win, political observers say, Abrams has to walk a tightrope: excite disgruntled Millennials by offering transformational change while offering competent leadership for all. How that plays out among women voters in the suburbs will be key. Millennial mom Tiffany Cowley finds an attack ad calling Abrams a socialist “worse than Nancy Pelosi” preposterous. “This election has made me realize that I am a lot more progressive than I ever thought I was,” she says. But Abrams is also attempting to appeal to those unlikely to vote for her, touring all 159 counties. In Glenwood, a dying timber town as desperate as they come for revival, Jacob Raiford says he is concerned that a liberal mind-set will melt away values that make America great. “You know what? It ain’t worth it if we get Obama Jr. in return,” he says.
On a wall on the 16th floor of the Hurt Building in Atlanta, next to a colorful “Shirley Chisholm for President ‘72” poster, guests to the Undivided tech incubator are asked to jot down their dreams.
Stacey Abrams took a second the other day to think, then quickly wrote: “To be governor of Georgia & own a transporter like in ‘Star Trek’.”
Just a few years ago in this rose-red state, both of those would have been seen as highly illogical.
But now the first one has drawn surprisingly close to reality for Ms. Abrams, the former Georgia house minority leader and a self-described “nerd” on a mission to capture the governor’s mansion. With one point separating her and Georgia Secretary of State Brian Kemp, Abrams could very well become not only the first black governor of Georgia, but the first black woman in America to ever lead a state.
“Leadership is combining vision with good government while also carving out new spaces for progress,” she said in an interview with the Monitor. She says she believes that message is unifying in a “moment of tumult.”
The razor-tight race has been accompanied by record early voting, national scrutiny, and heated controversy. After an Associated Press analysis found that 53,000 voter registrations had been suspended, about 70 percent of them African-American, Abrams and other Democrats called for Mr. Kemp to resign as the state’s top election official to avoid an appearance of conflict of interest. Kemp, for his part, has called the allegations of voter suppression a politically manufactured story.
All week, people have been waiting up to three hours in line to vote – with some 296,000 Georgians already having cast their ballots by Thursday. On Monday, the first day of early voting, about 129,500 people had voted either in person or by mail, compared with just over 46,000 four years ago. The Trump-anointed Kemp was leading Abrams 47 to 46 percent, with only 4 percent of likely voters still undecided, according to a poll out Wednesday from Reuters/Ipsos/University of Virginia Center for Politics. That’s well within the margin of error.
But in the past, calling for the razing of the Confederate memorial at Stone Mountain, as Abrams has done, would have disqualified any gubernatorial candidate – as would proposing an assault-style weapons ban. Yet the close race suggests what Emory University political scientist Andra Gillespie says is the emergence of a cross-generational and cross-racial moment that could transform the face of the Deep South state.
To win, political observers say, Abrams has to walk a tightrope: excite disgruntled Millennials by promising transformational change while offering competent leadership for all. Abrams has pointed to her history of bipartisan compromise while in the state house.
How that plays out among women voters in the suburbs will be key. But so will her outreach to rural Georgia, where she may not gain many votes, but where she is vying at least for a measure of acceptance.
“Stacey Abrams is running a new kind of campaign,” says Georgia State University political scientist Daniel Franklin, author of “Pitiful Giants: Presidents in Their Final Terms.” Unlike the big-name Clintonesque centrists who have lost – badly – here, “she is running from the left. She ain’t fooling around. She is trying a new model, a turnout model.”
That said, Professor Franklin says, Abrams’s “problem is that, if most of us are either donkeys or elephants, there aren’t enough donkeys in Georgia” to form a political majority.
Without a lot of political role models, Abrams is her own unique self, a self-described “Star Trek nerd” but also a Yale Law School graduate equally at ease crunching tax code as churning out pages for her latest romance novel, of which she has written eight. Her ninth book is her memoir, “Minority Leader.” The résumé sounds “like a reality-TV show,” she quips.
As Avondale High School valedictorian, she was once denied entry to a celebration of the state’s academic stars at the governor’s mansion. The high-schooler had arrived on a city bus. That incident left a deep impression on her, and the story has made her campaign deeply evocative for many Georgians who feel forgotten by the state’s rock-solid Republican majority.
But she has said she is not bitter. As minority leader, she collaborated with Republican colleagues to reshape a popular college scholarship and partnered with Republican Gov. Nathan Deal to reform the state’s criminal justice system.
“I am willing to work across the aisle, but not without carving out a slice,” she says.
With only 4 percent of Georgians undecided, there is little wiggle room for either candidate.
Abrams needs usually apathetic Democrats, including black voters, to come out in droves. She is banking on a re-run of the Doug Jones Senate win in Alabama last year, where black women made a statement by defeating controversial former Judge Roy Moore, who had been accused of sexual misconduct with teenage girls.
Kemp, for his part, needs the Trump base to turn out.
The campaign will be fought and won in the suburbs, where Trump remains unpopular, says Brent Buchanan, a GOP pollster out of Montgomery, Ala.
Last week, both candidates turned out national figures for fundraising events. Donald Trump, Jr., attracted about 50 people to a $50 a plate fundraiser for Kemp. Nearly 400 showed up to Abrams’s free event in suburban Morrow, where she was flanked by Sen. Elizabeth Warren (D) of Massachusetts and Ayanna Pressley, who won a stunning upset against a 10-term incumbent in the Massachusetts primary.
As supporters wore T-shirts with slogans like “Eat Grits, Be Happy,” and “Larry David for President,” Ms. Pressley warmed up the rally, exhorting them to urge their friends and family to vote: “If I’m speaking to the choir, well, I need you to sing!”
Tiffany Cowley stood listening nearby. A self-described “Millennial mom” from the suburbs, she is attending her first political event. Her issues are education and gun safety. What she saw as the naked insults of the Brett Kavanaugh nomination are also swirling in her mind as she prepares to vote for Abrams.
She has noticed a phenomenon of white women, including friends, “putting their husbands on a pedestal.” Both her parents owned businesses and worked hard. “Just like they did, I now co-parent with my husband, a partnership of equals.”
Attack ads on Abrams as a socialist “worse than Nancy Pelosi” who wants to let sexual predators onto playgrounds ring preposterous to Cowley.
“This election has made me realize that I am a lot more progressive than I ever thought I was,” she says.
Such personal realizations among educated suburban women may, in fact, drive the outcome.
“What realigned the American South over the last half century does not resonate as strongly with [Southern Millennials]” – issues like “social issues, gun issues, fears of the browning of America, fears of Muslims,” says James LaPlant, who studies Southern politics as a dean at Valdosta State University. “What is resonating are campaigns that emphasize economic issues, what the future portends for you, how to deal with large amounts of student debt.”
Andre Dickens, an Atlanta city councilor, says Abrams’s road to victory runs through Atlanta, her suburbs, and the Democratic cores of cities like Augusta, Albany, Macon, and Savannah.
On the other hand, Mr. Buchanan, the GOP pollster, says his firm doesn’t even bother to poll rural parts of South Georgia because, well, Abrams “doesn’t have a chance to pick off rural Republican voters in Georgia.”
But she has attempted to appeal to those unlikely to vote for her, touring all 159 counties, playing up her rural roots in places like Wheeler County. In 2012, she notes, she won the livestock competition at the annual legislative games with a 1,000-pound heifer named Bessie.
Abrams has a bevy of proposals that could aid rural Georgia: 22,000 state-funded apprenticeships, Medicaid expansion to help devastated rural health care, needs-based student loan repayment. The bottom line message to conservative voters, says supporter Audrey Gibbons, is that “Stacey Abrams is more than qualified to govern Georgia.”
To many, she says, Abrams’s candidacy asks fundamental questions about citizenship and belonging.
“Conservatives would be disappointed if she wins like they were disappointed when we elected the skinny man with the funny name,” says Ms. Gibbons, the Democratic county chair in Glynn County, in a phone interview. “Abrams was not born in Georgia, but she came to Georgia and made her home, became educated, did everything we are taught to do, believes in Jesus Christ. Then the question becomes: Why should we hold up because our views don’t equal up to your antique views?”
Jacob Raiford says he has three clues to the answer.
He lays them out one by one on the counter at his Glenwood, Ga., package store: a large .45 magnum revolver, a snub-nosed smaller caliber, and a shotgun.
The gesture of displaying guns to a stranger seems menacing – but only for a moment.
Guard down and guns stowed away, Mr. Raiford turns out to be a talkative 29-year-old who does not identify as a “coddled” Millennial. “I got the belt if I misbehaved, and I agree with that,” he explains.
He has worked as a prison guard, owned a restaurant by 20, and played for a championship basketball team.
“I was middle class once, but the middle class is gone,” he says. He blames the Obama administration and what he sees as the infusion of laissez-faire living through government support. He also speaks openly about his fear of Muslims – despite the fact that in 2014 fewer than 1 percent of Georgians identified as Muslim, according to the Pew Research Center. “Muslims in prison means that the mess can’t serve bacon. They are trying to turn us into them,” he says. Abrams is a practicing Christian.
Raiford says he is concerned that a liberal mindset will melt away values that make America great – even if Glenwood, a dying timber town as desperate as they come for revival, continues to struggle under President Trump.
The hospital closed in 2012. “It killed this town,” he says matter-of-factly. Abrams wants to take a federal Medicaid expansion that could turn the lights on at the Lower Oconee and other shuttered or struggling rural clinics.
“But you know what? It ain’t worth it if we get Obama Jr. in return,” he says.
Raiford insists his opposition has nothing to do with race, noting that 90 percent of his clientele are African-Americans, many of them loggers, truck drivers, and prison guards.
Nevertheless, Raiford’s admission that he’d rather see a town suffer than watch a black woman become governor reflects “a long, long history of racial division in this state and the role of race in [the state’s political] realignment,” says Professor LaPlant. “That still plays a powerful role in Trump country down here, or in Kemp country.”
In her insistence on visiting areas where she has little chance of changing minds, Abrams, political scientists say, is explicitly ignoring a liberal canard: that people like Raiford are morally suspect because they are willing to undercut the economic vitality of their own communities in defense of “heritage.”
There is an assumption that “people should focus on economic benefits to themselves when they vote, and if they don’t they are not voting correctly,” says University of Georgia political scientist Jeffrey Glas, author of a 2016 Presidential Studies Quarterly article titled “There is Nothing ‘Wrong’ with Kansas.”
“Most voters just know the Democrat versus Republican part, but they are not voting on that stuff. They are thinking about the one or two issues they really do care about, which could be social issues, identity groups, things like that. That is why it becomes a dangerous little area to talk about correct and incorrect voting.”
In that way, Abrams’s bid to govern Georgia is a test of something more profound, says Mr. Glas.
He notes that the vast majority of Americans are more practical than ideological. Because of that fact, he says, “I think that we all in our hearts want to get past [raw partisanship]. It’s just that our heads aren’t quite ready to do that.”
But as a gauge of that goal, adds LaPlant, an Abrams victory in November “would be an earthquake.”
When it comes to cutting down on teens smoking, activists found they can do more by going small. Instead of working at the national level, they’re focusing on towns and cities – and seeing big results.
Across the United States, towns, cities, and counties are passing “Tobacco 21” laws banning the sale of tobacco products to those under the age of 21. These come in the wake of several failed state-level efforts to raise the smoking age. The turning point was in 2005, when the town of Needham, Mass., successfully passed Tobacco 21 legislation. Today, six states and more than 350 municipalities have passed similar laws, according to the Campaign for Tobacco-Free Kids. “We all realized that working locally was the secret sauce,” says Rob Crane, a family doctor and founder and president of the Preventing Tobacco Addiction Foundation. “There are no tobacco lobbyists at the local level, just parents, doctors, teachers, and neighbors serving on local city councils.” And in turn, success at the local level has re-energized state-level antismoking advocacy. “Taking local action in your community is definitely the way to get things moving in your home state,” says Jonathan Winickoff, a practicing general pediatrician and professor at MassGeneral Hospital for Children. “[I]t is good to keep in mind that the majority of all political parties, age groups, and even smokers themselves support these laws.”
When the Greater Kansas City Chamber of Commerce was convening experts and community leaders around a health initiative, the idea of raising the smoking age from 18 to 21 kept coming up.
In fall 2015, the chamber declared that it would work to advocate for a higher smoking age throughout the region. Within two weeks, in November 2015, legislators in both Kansas City, Mo., and Kansas City, Kan., passed ordinances.
“The amount of interest and the momentum behind this idea surprised even us,” says Scott Hall, senior vice president for civic and community initiatives, in a phone interview. “We have exceeded our wildest dreams about how we would be able to collaborate with our local communities to make this happen.”
From there, the idea spread like wildfire. Today, some 28 communities in the greater Kansas City region have raised the smoking age to 21, and that number continues to grow, Mr. Hall says.
The scenario Hall describes is becoming more common across the United States, as towns, cities, and counties pass “Tobacco 21” laws banning the sale of tobacco products to those under the age of 21.
“We have now seen six states and over 350 municipalities adopting Tobacco 21 laws,” says John Schachter, director of state communications for the Campaign for Tobacco-Free Kids, in a phone interview. “We saw a lot of municipalities at the local level, especially in Massachusetts, start this trend.”
According to data from Mr. Schachter’s organization, the policy shift has taken place in communities ranging from large cities like Boston, Chicago, and New York to small towns with populations in the thousands or even hundreds. And he sees the legislation as key to addressing tobacco use among young adults.
“Since tobacco is so harmful, we should do everything we can to prevent tobacco use, especially among young people,” he says, noting research findings that some 95 percent of adult smokers started before age 21. “Increasing the legal age of sale to age 21 will surely help.”
Rob Crane, a family doctor and professor of family medicine at The Ohio State University, is founder and president of the Preventing Tobacco Addiction Foundation and Tobacco 21. Dr. Crane recalls his organization assisting with several failed state-level efforts to raise the smoking age.
The tide changed in 2005, he says, when the town of Needham, Mass., successfully passed Tobacco 21 legislation.
“We all realized that working locally was the secret sauce,” says Crane in an email interview. “There are no tobacco lobbyists at the local level, just parents, doctors, teachers and neighbors serving on local city councils.”
Both Crane and Schachter agree that success at the local level has fueled movement at the state level.
“Once we achieve enough local ordinances, there is then pressure in the state legislature to act,” says Crane, who predicts that by fall 2019, some 40 percent of the country will be blanketed with Tobacco 21 policies.
This growing momentum behind Tobacco 21 legislation at the local level comes as the US Food and Drug Administration is mounting an expansive campaign of enforcement around the marketing and selling of e-cigarettes to teenagers, an issue of concern for public health officials as the popularity of vaping grows in that age category.
Jonathan Winickoff, a practicing general pediatrician and professor at MassGeneral Hospital for Children, focuses his research on strategies to address tobacco use.
“Taking local action in your community is definitely the way to get things moving in your home state ... [and] it is good to keep in mind that the majority of all political parties, age groups, and even smokers themselves support these laws,” says Dr. Winickoff in an email interview. “All these successes have occurred in just the past five years, so the momentum is ... continuing to build.”
Hall says that having data about a reduction in tobacco use among youth in Needham after legislation was passed there helped fuel a snowball effect with other communities.
“Other cities began to realize the incredible impact that this could have,” he says. “We were able to articulate to our members – and to our partners in the various municipalities and counties around our metro – that if you did this, it would really work.”
A report detailing sexual abuse at a girls school founded by an American woman has spurred soul-searching in a country where debates about foreign assistance run deep.
“More Than Me,” a charity founded by a young American woman, operates more than a dozen public schools in Liberia. Last week an investigative news report detailed a More Than Me staffer’s abuse of girls and the organization’s response, and since then the capital city has seen an outpouring of grief and anger. For some the scandal is a warning against the often unchecked power of foreign charities in Africa. “The Liberian government’s neglect of the education system has made it vulnerable and made communities dependent on outside charities to educate their kids,” says Lakshmi Moore, whose nonprofit organized a march through the streets of Monrovia Thursday to call for accountability. At times the outrage has been mixed with feelings of guilt and vulnerability over the fact that the girls needed a charity’s help to begin with and that their government had not protected them. Activists have called for better oversight as well as better services for victims of gender-based violence. But for many people in the communities such schools serve, things are still more complicated. Outsiders “can say [close the school] because they have money,” says the mother of one student. “Why should they ask for the school to be closed? [I have] no job, nobody to help me.”
She was an idealistic American with a vision that everyone, no matter where they came from, deserved a good education. They were girls from a poor Liberian neighborhood who dreamed of a way out of their difficult circumstances.
When Katie Meyler opened the tuition-free “More Than Me” school for girls in a smart green and white house in the center of Liberia’s capital in 2013, it seemed to many both here and abroad a perfect match.
“More Than Me is designed specifically to help young women choose the lives they wish to lead,” explained a 2015 profile of Ms. Meyler in The Christian Science Monitor.
“We address every barrier that a girl faces,” Meyler told the Monitor at the time. “We are relentlessly fighting for our girls.”
But beneath the upbeat story that More Than Me told about itself – and that news outlets like ours told about More Than Me – was a darker story. Over a period of several years, one of the charity’s key founding staff members, Macintosh Johnson, allegedly raped or assaulted as many as dozens of girls in its care. Meanwhile, Meyler and the nongovernmental organization had distanced themselves from the scandal, going on to take over 18 schools across Liberia and draw accolades from the world’s most influential philanthropists, according to an exposé published last week by Time magazine and the investigative news nonprofit ProPublica.
In the days since the story broke, the revelations have inspired an outpouring of grief and anger here – a kind of societal soul-searching that has filled the airwaves and the headlines, and become the subject of chatter between strangers on public transportation and of hushed family conversations over dinner.
Newspaper hawkers bobbed through traffic Wednesday in the capital clutching papers with headlines that read “Petition to shut down More Than Me swarms social media” and “Online campaign seeks justice for victims of sexual abuse at More Than Me,” signs of the growing tide of public outrage.
“As a mother I [am] very angry because the school did nothing to protect the children in the first place, and that white woman was just using the children to get rich,” announced one woman in a shared taxi heading for the city center.
“But what were the mothers doing?” the taxi’s driver shot back. “These days mothers care less about their daughters.”
Where to put the blame for what happened, indeed, remains a major question here. For some, the scandal is a warning against the often unchecked power of foreign charities in Africa. It “reveals our warped tendencies to glorify foreigners for swooping into poor countries under the guise of doing good,” wrote the political commentator Robtel Neajai Pailey in Al Jazeera this week.
But for others, what happened to the girls at More Than Me is a reminder, more broadly, of the many dangers that still come with being young, poor, and female in Liberia today. For many Liberians, their reaction to the article was a mix of anger and guilt – that these young Liberian women had been mistreated by a foreign charity, yes, but also that they’d needed that charity’s help to begin with, and that the government hadn’t protected them from the abuse either.
“We are getting to the point where we have to wear our outrage on the surface,” says Lakshmi Moore, a member of the Liberia Feminist Forum, which organized a march of about 500 people through the streets of central Monrovia Thursday to call for accountability from both the government and the charity. “Women’s issues aren’t being taken seriously in this country. But if one good thing could come of this tragedy, it would be if it gives us the momentum to really change things here.”
Johnson was arrested and suspended from the school in June 2014, after a few students confided in staff, and an administrator reported him to police. A first trial ended with a hung jury. He had AIDS, according to the ProPublica report, and died in 2016 while awaiting a second trial.
On Tuesday, the country’s information ministry promised that it would carry out a “meticulous” investigation into what happened at More Than Me. The charity’s Liberian Advisory Board, meanwhile, has asked Meyler to temporarily step down while it conducts its own investigation, and she has.
“In reviewing the allegations as published by ProPublica and TIME we uncovered several statements that were either inconsistent with the information provided to us by More Than Me leadership or that were new information,” the board said in a statement.
Activists for women and girls have called for better oversight of charities here, as well as better services for victims of gender-based violence.
“The More Than Me incident highlights the critical need for improving the justice system … to protect Liberian women and girls from violations in the first place,” wrote the organizers of Thursday’s march in a statement. “We therefore remind the Liberian government to use this as an opportunity ... to protect Liberian women and girls from these horrifying and inhumane experiences.”
But for many in the communities More Than Me serves, things are more complicated.
Outsiders “can say [close the school] because they have money,” says Angeline Nyonnoh, the mother of a student at one of More Than Me’s schools who lives in West Point, a jumbled seaside settlement of shacks and small houses backing up against the Atlantic near central Monrovia. Most of the girls who attend More Than Me’s original school came from this area. “Why should they ask for the school to be closed?” Ms. Nyonnoh says. “[I have] no job, nobody to help me.”
Indeed, says Ms. Moore, the reason education charities have become so powerful in Liberia is that the country’s public education has long been among the worst in the world. When the Liberian Ministry of Education surveyed adult women who had attended school through the fifth grade, for instance, it found only 20 percent could correctly read a single sentence.
“In terms of people wanting to save Liberia, [Meyler] is not new,” she says. “The Liberian government’s neglect of the education system has made it vulnerable, and made communities dependent on outside charities to educate their kids.”
Two years ago, for instance, More Than Me was handed control of more than a dozen failing Liberian public schools. The move was part of a government-led experiment that turned a portion of the country’s public schools over to private organizations to be managed independently.
“We do need the [foreign] support” for our public schools, says Charles Wreh, a Monrovia resident and father. “We need that help.” But in the future, he adds, “let the monitoring be more rigorous.”
Across Monrovia this week, indeed, that seemed to be a kind of collective prayer, and a collective apology. Let someone watch over them. Let someone protect our girls.
Reporters do get to out-of-the-way places. Eating local foods can challenge not only their taste buds but their very concept of what counts as food. The adventure can be rewarding.
After a long, hot day of reporting and travel, a pair of veteran journalists, both devout foodies, were happy to sit down to dinner. The location was a boutique hotel in Agadez, Niger, on the edge of the Sahara Desert. On the menu that night was the chef’s specialty: camel stew. Just the idea might be off-putting for some, considering the camel’s status as a domesticated if famously ill-tempered pack animal. But the stew is, for lack of a better word, delicious. Auberge d’Azel chef Abdoulaye Mahamane acknowledges that his nomadic Tuareg tribe has depended upon the long-legged, loping beasts to sustain life in the harsh desert environment. “It’s an important animal for us; we have a special relationship,” he says. But that hasn’t stopped him from dishing up his signature dish. The stew has an earthy flavor, enhanced by ground peanuts and shriveled spinach greens and cooked to tender perfection with cumin, ginger, and saffron. Mr. Mahamane says there is one group of guests who can’t get enough of his stew. “It’s mostly Americans who always eat the camel,” he says.
Discerning taste buds in the West may be unfamiliar with camel meat. Perhaps what gives pause is the thought of converting ornery camels – the working end of long, grimy caravans that have crossed deserts for centuries – into delicacies of the table.
But that hasn’t stopped chef Abdoulaye Mahamane from dishing up his signature camel dish with peanut and spinach sauce. And from the kitchen of the boutique Auberge d’Azel hotel – a tiny last bastion of civilization in Agadez, Niger, on the southern frontier of the sweltering Sahara Desert – he says there is one group of guests who can’t get enough.
“It’s mostly Americans who always eat the camel,” says Mr. Mahamane, the head of the kitchen, whose clients range from a University of Chicago crew of dinosaur hunters – in Agadez en route to launching a two-month desert expedition – to European Union military advisers, who passed through recently.
In recent days guests also included a pair of foreign journalists on the trail of a worldwide human migration story, both devout foodies and one of them an American, who of course ordered the camel. Twice.
Mahamane’s camel stew has an earthy flavor, enhanced by ground peanuts and shriveled spinach greens, and cooked to tender perfection with cumin, ginger, and saffron. It’s served on a bed of couscous.
The result is, for lack of a better word, delicious – creating a pronounced cognitive dissonance with the conjured images of this famously bad-tempered curmudgeon of a pack animal, its meat turned leathery by years of constantly battling its owner-friend – and surviving in the harshest of environments.
“The Americans all want it,” says Mahamane, shaking his head in surprise. “I don’t know why.”
Even in these parts, where the temperature routinely tops a stifling 105 degrees Fahrenheit, eating camel can be a controversial practice – akin to the reservations some in the West have about eating meat from domesticated animals people have bonded with in life, such as dogs, cats, and horses.
With their soft snouts thick with long bristly hair – just don’t get near the powerful teeth and tongue, which bite hard and can easily tear through thorn tree branches – and very long eyelashes designed to protect their eyes from blowing sand, camels have defined the cultures of nomadic tribes from the western Sahara across Africa toward Somalia and Yemen and beyond.
“It’s an important animal for us; we have a special relationship,” says Mahamane, who describes the rituals involving camels of his nomadic Tuareg tribe of northern Niger and the Sahel. They have depended upon the long-legged, loping beasts to sustain life in such a hostile environment since camels were first domesticated by humans some 5,000 years ago.
“As Tuaregs, we rely on camels to carry us into the mountains and into the bush,” he says. During marriage ceremonies, there are often camel races, as well as a fantasia ritual, in which people stand in the middle and sing, while camels circle.
Some Tuaregs – such as Minat Alhou, who is himself a carnivore – believe that camel meat is imbued with medicinal powers of healing when “no other medicine will work.”
But not all will visit the butcher where thick hunks of camel meat are sold for $2.39 per pound, about 40 cents cheaper than beef.
“There are some Tuaregs who eat camel, and others who won’t, even if they like meat,” explains Mahamane. “Those who don’t eat it ask, ‘How can you eat the animal that works for you?’ ”
But cooking camel requires care, so that the dish doesn’t wind up as indigestible as the hardscrabble existence from which spring both camels and nomadic tribes.
“Camel meat is tough!” exclaims the chef, who warns against under- or overcooking. He recommends making stew dishes from the haunch, where the mix of meat and fat is at its flavorful best.
Among the other cuts that are sold are camel tongue, filet, and organ meats. Ribs are cut up to make sauces.
But beware the age of the meat you are buying, warns Mahamane. The cost of young and old camel meat is the same. Optimal age for tenderness is two years old. But with camels living an average of 40 or 50 years, there are plenty of jaw-grinders in the market.
Mahamane, who is in his mid-20s, runs his kitchen with practiced mastery, and he learned the camel dish – his only one so far – during a three-month apprenticeship with a chef from the small West African state of Benin, where, ironically, camels are scarce.
He then studied under a Tuareg chef for a year, learning to create other specialties on the menu, including Tuareg-style mutton stew (slow-cooked with cumin) and spicy mutton tagines richly flavored with sauces of dried fruit or homemade pickled lemon.
Mahamane says he is considering new possibilities, as demand is high – especially among Americans – to eat these ships of the desert.
“As time goes by, I can imagine wanting to learn another camel recipe,” he says.
(Camel meat in spinach and peanut sauce)
Prep time: 20 minutes. Cooking time: 35 minutes
Serves 8
Ingredients:
3 lbs camel meat
2 large onions
1 tsp each of salt, cumin, ginger
½ tsp black pepper
A large pinch of saffron
½ lb crushed, ground, or powdered peanuts, depending on desired texture
1½ lbs fresh spinach
2 tbsp palm or other vegetable oil
Ask your butcher to cut a piece of camel from the fattier part of the haunch. Cut it into bite-sized cubes.
Heat the oil in an open pressure cooker, chop the onions roughly, and soften them in the oil over a medium heat.
Add the camel meat, and brown.
When it begins to give off its juices, add the salt, pepper, and spices. Cook for a few more minutes, then add the crushed peanuts.
Add enough water to cover the meat, seal the lid of the pressure cooker, bring it up to pressure and cook for 30 minutes. Remove the lid from the pressure cooker when it has cooled sufficiently (under the cold tap).
Separately, rinse the spinach well, detach the stalks, and wilt it quickly over a high heat in a lightly oiled wok or other pan. Remove the spinach from the heat, drain well, and chop it roughly.
Add the spinach to the camel stew and cook briefly, stirring well.
Serve with couscous, rice, or tagliatelle.
Peter Ford
Early this month, Iraq’s newly designated prime minister, Adel Abdul Mahdi, launched a website asking Iraqis to apply to be top government ministers. More than 36,000 people applied. Almost all were independent of established parties. Mr. Mahdi’s extraordinary move was contrary to a tradition in Iraq of politicians forming cabinets in backroom deals based on a power-sharing quota among Shiites, Sunnis, and Kurds. If those he names to his cabinet next week are capable administrators not beholden to party interests in gaining jobs for loyalists or revenue from bribes, it will reflect a new political maturity in Iraq. Since the 2003 ouster of Saddam Hussein, Iraq has made steady if uneven progress toward a national identity based on shared values and common interests. As scholar Francis Fukuyama writes in a new book, “Identity,” countries must organize their identity on broad-based rights, or a view of each individual as capable of self-rule. “Unless we can work our way back to more universal understandings of human dignity,” he writes, “we will doom ourselves to continuing conflict.” Iraq may be slowly heeding that lesson.
In a region rife with identity politics arising from ethnic and religious differences, Iraqis experienced something very universal in early October. A newly designated prime minister, Adel Abdul Mahdi, launched a website asking Iraqis to apply to be top government ministers.
Within days, more than 36,000 people applied. Almost all were independent of established parties. And 15 percent were women.
Mr. Mahdi’s extraordinary move was quite contrary to a tradition in Iraq of politicians forming cabinets in backroom deals based on a power-sharing quota among the country’s three major communities: Shiites, Sunnis, and Kurds.
One prominent politician, Ammar al-Hakim of the National Wisdom Movement, praised Mahdi for trying to “choose those who meet the criteria of efficiency, firmness, integrity, and ability to serve and fight corruption.”
An even more influential figure, Shiite cleric Moqtada al-Sadr, whose party won the most votes in a recent parliamentary election, declared, “Is it not high time for qualified, independent technocrats [to assume posts in the coming government] in order to coexist in peace and security away from the dagger of treachery and corrupt deals?”
Next week, the prime minister-designate plans to announce his proposed cabinet. If many of the names are capable administrators not beholden to party interests in gaining jobs for loyalists or revenue from bribes, it will reflect a new political maturity in Iraq.
Since the country’s embrace of democracy after the 2003 ouster of dictator Saddam Hussein by the United States, Iraq has made steady if uneven progress toward a national identity based on shared values and common interests. That progress has been lately hastened by three events.
One was a recognition by the Shiite majority after last year’s victory over the Islamic State that it must treat minority Sunnis as equal citizens. The defeat of the Sunni militant group, which had taken over a third of Iraq in 2014, was made possible only by unifying Iraq’s major groups behind the nation’s security forces.
Another event was mass protests since July among young Shiites in Iraq’s second-largest city, Basra. Their demands were quite basic and secular: an end to corruption, a regular supply of electricity and clean water, and a curb on Iranian influence in Iraq.
About 40 percent of Iraqis were born after 2003. They have seen four successive changes in power by democratic means. They expect more of their leaders than a partisan contest to divide up the nation’s oil wealth.
The third event was a statement in September from the influential Grand Ayatollah Ali al-Sistani saying that the next prime minister should not be someone who has held the post before. The choice of Mahdi reflects a wide consensus among the parties. He is a former vice president and oil minister who shuns sectarianism in governance.
Many democracies are now highly divided over identity politics, from gender to religion to ethnicity. In Iraq, such divisions are even institutionalized as a power-sharing norm. It has not worked.
As scholar Francis Fukuyama writes in a new book, “Identity,” countries must organize their identity on broad-based rights, or a view of each individual as capable of self-rule. “Unless we can work our way back to more universal understandings of human dignity, we will doom ourselves to continuing conflict,” he writes.
Iraq may be slowly heeding that lesson. For all its turmoil, its people are shining a light for the rest of the Middle East.
Each weekday, the Monitor includes one clearly labeled religious article offering spiritual insight on contemporary issues, including the news. The publication – in its various forms – is produced for anyone who cares about the progress of the human endeavor around the world and seeks news reported with compassion, intelligence, and an essentially constructive lens. For many, that caring has religious roots. For many, it does not. The Monitor has always embraced both audiences. The Monitor is owned by a church – The First Church of Christ, Scientist, in Boston – whose founder was concerned with both the state of the world and the quality of available news.
Out of food and money, today’s contributor found comfort in the Bible, which brought her a newfound confidence in God’s care for all. Very soon, her family’s immediate needs were met in unexpected ways.
When we first moved to a farm in Indiana in 1972, our family had to manage on very little. We had two mortgages because our house in another state had not yet sold. And my husband, Dwight, and I had used up our savings on his graduate school studies. We’d borrowed the down payment for the farm from a family member.
I stayed home with the kids while my husband worked at a children’s museum. But his job offered only half the salary he’d been making while in the armed forces. When we’d moved in late summer, we thought the garden could be an immediate source of vegetables. Our plan was to grow all of our own food. But the previous owner had allowed the garden to go to weeds, so it needed a lot of work.
I remember one morning very clearly. It was in September, three days before payday. We’d already run completely out of money. There was absolutely no food left, not even a cracker, not an egg.
As our family had often done in times of need, I quietly began to pray, affirming the goodness and abundance of God, His great love for each of His children, and everyone’s right to be blessed with good. I began to think about the Bible stories I’d learned in the Christian Science Sunday School that illustrated God’s ample supply, or provision.
For instance, by following the prophet Elisha’s inspired directions, a woman was able to pay off her debt with a pot of oil that never ran out (see II Kings 4:1-7). I also thought about a remarkable account of Jesus feeding thousands with just a few loaves and fish (see Matthew 15:32-38). Then I recalled the 23rd Psalm, which speaks of the Lord as our Shepherd. We’re promised that our cup will run over and that there will be a table prepared for us in the presence of our enemies.
It became clear that although times had certainly changed since then, the “enemy” was still the belief that something could challenge God’s supremacy and constant care. And it was this I was facing in my prayers.
Those Bible accounts spoke to me in a very real way. Soon I began to feel more confident that our Shepherd, our Father-Mother God, is surely looking after all His children and providing the inspiration and answers that meet our needs. I felt an expectation infused with spiritual promise.
In a little while there was a knock at the door, and there stood Uncle Ronald, who lived about 30 miles away and seldom dropped in. This morning, however, he’d come with a gallon and a half of fish he’d caught and wanted to share with us!
Soon another knock came at the door. It was a new friend who had been baking bread that morning and brought us a fresh, warm loaf. As she left, I thought of the Bible story of the loaves and the fish again. I was seeing God’s provision in a way that fit where we were – in modern times – but was a real-life illustration of the divine care Jesus had demonstrated so long ago.
My heart was brimming over with gratitude. Then a third knock came. A mom in our son’s kindergarten carpool came by to drop off a huge grocery sack full of little-bitty cherry tomatoes she had taken the time to pick for us from her garden.
I came to know from that experience what it means to have our “cup run over.” Our family had enough bread, fish, and tomatoes for three days, and then, with my husband’s paycheck, there was grocery money again. We never again ran completely out of food. In fact, within that year our first house sold, we paid our family back, and I opened a nursery school at the farm. Dwight’s salary and responsibilities continued to grow. In the next few years our garden became so plentiful that we had more than enough to share with neighbors and sell. We grew our own food for the next nine years. And we improved our house and eventually sold the farm for much more than we’d paid for it.
The biggest lesson we learned was to trust God completely. That glorious feeling of expectation and faithful wonder that God was going to care for us – instead of an anxious sense of worry about whether we would have enough – has never really left me.
Adapted from an article published in the July 20, 2009, issue of the Christian Science Sentinel.
Thank you for accompanying our exploration of the world today. Please come back next week, when we will publish the second part of our new weekly series on global migration. We will look at how Jordan soldiers on as the host of a million-plus Syrian refugees even amid international donor fatigue.