Are Alabama cities the “most dangerous”? Critics pounce on rankings
Is Gulf Shores the ninth most “dangerous beach” in the United States?
According to the website “Travel Lens,” it is. And in the past few weeks, Bessemer, Mobile and Birmingham have all been labeled among the “most dangerous” cities in the U.S.
Once again, Alabama finds itself high atop the rankings. But this isn’t college football or basketball, and local officials are agitated.
Criminal justice and journalism experts say officials mostly have a point, and that lists calling out cities, beaches or anywhere else as “dangerous” are produced with questionable methods and arbitrary data.
The companies that put them together stand by their work, though they understand why the results have their critics.
Eleven years ago, the FBI was the chief critic. The federal agency called out the rankings for “simplistic and/or incomplete analyses” that lead to “misleading perceptions adversely affecting cities and counties, along with their residents.”
Other organizations have warned about the rankings in the past. In 2007, the American Society of Criminology’s executive board adopted an official policy position opposing the use of the FBI’s Uniform Crime Reports (UCR) to rank American cities as “dangerous” or “safe” without considering the limits of the data.
Four years later in 2011, the U.S. Conference of Mayors denounced the release of city-by-city crime rankings as a misuse of FBI crime data, noting that rankings “can inflict” damage on cities.
The number of “safest” or “most dangerous” lists dwindled somewhat around that time, criminologists say. But that is no longer the case, and any number of websites are pushing them out fast and furious in what critics say is a brazen attempt to lure eyeballs, and the advertising dollars that follow them, to their sites.
“I understand the hope for good crime data, but a ranking never provides useful information,” said Kelly McBride, chair of the Craig Newmark Center for Ethics and Leadership at the Poynter Institute. The St. Petersburg, Florida-based non-profit specializes in journalism ethics training, fact-checking, and media literacy.
“I can’t find one outcome from a (crime ranking) that has been positive,” said McBride. “I just can’t.”
‘Racist effect’
The rankings are harming public perception of cities, critics contend. And most of those cities, they argue, are majority Black and poorer communities.
“It keeps away tourists and it creates unnecessary worries,” said Janet Lauritsen, Curators’ Distinguished Professor Emerita in the Department of Criminology and Criminal Justice at the University of Missouri-St. Louis, and a recent past president of the American Society of Criminology. “It hurts small businesses in those cities and efforts to create turnarounds for their neighborhoods. The list just goes on and on.”
McBride said they have a “racist effect” on cities.
“You have these predominately Black communities being portrayed as more violent when the data they are using is unreliable, uneven and unfair,” she said.
Ranking companies say they are providing useful information through a deeply researched set of statistics.
WalletHub, one of the leaders in providing an assortment of rankings, strives to be as transparent as possible with its studies. On its site, readers can find a laundry list of the datasets the company uses to determine its rankings. For instance, in last year’s “safest cities” ranking, WalletHub listed a host of measurable data categories it mixed in with crime data: identity-theft complaints per capita, foreclosure rate, debt-to-income ratio, share of households with emergency savings, and much more.
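WalletHub does not publish its scoring code, but composite rankings of this kind typically normalize each metric and combine them with weights. Here is a rough sketch of that arithmetic, with metric names, values and weights invented purely for illustration:

```python
# Hypothetical sketch of a composite "safety score" blending crime and
# non-crime metrics. The metric names, values and weights are invented
# for illustration; they are not WalletHub's actual model.
metrics = {
    "violent_crimes_per_capita": 0.006,
    "identity_theft_complaints_per_capita": 0.004,
    "foreclosure_rate": 0.02,
    "share_without_emergency_savings": 0.45,
}
weights = {
    "violent_crimes_per_capita": 0.50,
    "identity_theft_complaints_per_capita": 0.20,
    "foreclosure_rate": 0.15,
    "share_without_emergency_savings": 0.15,
}

# A weighted sum of risk factors; a real study would first normalize
# each metric across all 182 cities so the units are comparable.
risk_score = sum(metrics[k] * weights[k] for k in metrics)
print(f"Composite risk score: {risk_score:.4f}")  # lower = "safer"
```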
In that ranking, Mobile was the 93rd “safest” city out of 182 cities WalletHub measured – far from the No. 2 “most dangerous” that San Francisco-based personal finance technology site MoneyGeek tagged on Alabama’s Port City earlier this month.
Montgomery was No. 100 in WalletHub’s survey and Birmingham was 167.
Diana Polk, spokeswoman with WalletHub, said their company is aware of the criticism.
“We are aware that not everyone may agree with our findings, but we strive to produce well-researched, transparent studies, and we always list our sources and methodology,” she said. “We put a lot of work and stand by what we do, but at the end of the day, each person has the right to their own opinions.”
Debating the data
But it’s the mixing of those statistics – including the use of financial data with FBI crime statistics – that is creating outrage.
The FBI data, itself, is also in flux. As McBride wrote in October, the new National Incident-Based Reporting System (NIBRS) replaced the FBI’s long-used Uniform Crime Reports (UCR) as the basis for the 2021 annual crime data. She wrote that the switch “will be fertile ground for those who want to distort or exaggerate crime trends for political or commercial reasons.”
Mobile found itself in that situation earlier this month, and the timing could not have been worse after MoneyGeek’s rankings emerged. Only St. Louis was considered more “dangerous” than Mobile, based on a ranking that assigned a dollar value to crime. The ranking was published by Forbes and released at the same time Mobile officials were attempting to quell public concerns about going downtown for Mardi Gras to watch the parades in the aftermath of a deadly shooting on New Year’s Eve.
MoneyGeek’s data set also relied on 2021 FBI statistics, the first compiled under the new reporting system.
Mobile officials believe their crime data was inflated, and some of the statistics MoneyGeek utilized raise obvious questions. For instance, MoneyGeek’s data set indicates 485 rapes happened in Mobile in 2021, as opposed to 65 in Birmingham – a whopping, and unexplained, 153% difference between the two cities.
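The 153% figure is consistent with a symmetric percent difference, in which the gap is measured against the average of the two counts. Assuming that formula is the one behind the number, a quick calculation reproduces it:

```python
# The 485-vs-65 rape counts, expressed as a symmetric percent
# difference (the gap divided by the average of the two counts).
mobile_rapes = 485
birmingham_rapes = 65

average = (mobile_rapes + birmingham_rapes) / 2
percent_difference = abs(mobile_rapes - birmingham_rapes) / average * 100
print(f"Symmetric percent difference: {percent_difference:.0f}%")  # ~153%

# As a plain ratio, the disparity is even starker:
print(f"Mobile's count is {mobile_rapes / birmingham_rapes:.1f}x Birmingham's")  # ~7.5x
```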
MoneyGeek’s analysis also points to 111 homicides occurring in Mobile in 2021, which would smash the city’s all-time record. Mobile police, however, have long reported 51 homicides in 2021.
“I trust our city’s statistics more than a third party,” said David Clark, president & CEO with Visit Mobile. “People can take certain numbers and then make them to what they are trying to make them to be. But I think, lately, violent crime is down in Mobile. The most trusted source is always the source closest to the source, and that’s our city.”
Change in FBI data
Nicholas Zeitlinger, digital public relations manager with MoneyGeek, said they were simply utilizing the most updated data they could find to produce their report. The numbers for Mobile reflected figures obtained from the FBI’s 2021 annual crime report.
That’s raising questions about whether cities were compared on an apples-to-apples basis.
Mobile Police Chief Paul Prine said his agency was investigating how their numbers were submitted to the FBI by the Alabama Law Enforcement Agency, and he believes it might have been through the NIBRS method. That system can count multiple crimes occurring during one incident, as opposed to the UCR system that counts only one crime – the most serious offense – per incident.
“One of the biggest changes is that oftentimes, when a crime is committed, there are several other crimes,” McBride said. “You might have a carjacking and an assault and a robbery and an assault on an officer. It’s four crimes.”
She added, “Previously, what was counted was only the most serious crime, and everything else was not counted as an individual crime. Now, all those crimes are counted.”
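McBride’s example can be made concrete. Under the old UCR summary system’s hierarchy rule, only the most serious offense in an incident is counted; under NIBRS, each offense counts. A minimal sketch, with a simplified, hypothetical severity ordering standing in for the FBI’s actual offense hierarchy:

```python
# One incident containing several offenses, per McBride's example.
incident = ["carjacking", "assault", "robbery", "assault_on_officer"]

# Hypothetical severity ranking (1 = most serious). The real UCR
# hierarchy rule uses the FBI's fixed ordering of offenses.
severity = {"carjacking": 1, "robbery": 2, "assault_on_officer": 3, "assault": 4}

# UCR summary reporting: the hierarchy rule keeps only the most
# serious offense, so this incident contributes one crime.
most_serious = min(incident, key=lambda offense: severity[offense])
ucr_count = 1

# NIBRS: every offense in the incident is counted.
nibrs_count = len(incident)

print(f"UCR counts {ucr_count} crime ({most_serious}); NIBRS counts {nibrs_count}.")
```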
Not every city is using the new system, and it is unclear which cities are using NIBRS versus UCR. Also uncertain is which cities are reporting their data to the FBI at all.
“In comparing community to community, there are assumptions that every law enforcement agency is gathering their records in the same way,” McBride said. “We know that this is absolutely not the case. We know what might be counted as felony sexual assault in one community will be completely shelved and not even counted as a crime in another community.”
She said that only 40% of law enforcement agencies initially turned in their data to the FBI.
“Most of the largest cities are not included in these data sets because they create their own data reports,” McBride said. “Chicago and Los Angeles are not included (in the FBI annual data). And in comparing community to community, it makes the assumption that law enforcement is gathering their records the same way. We know that is absolutely not the case.”
MoneyGeek said it analyzed 263 cities, or 80% of those in the U.S. with populations over 100,000 residents. Huntsville and Montgomery were not part of the analyses.
“So, when you make a list like this, the caveat is that 60 percent of the municipalities are not even included in this,” McBride said. “You are essentially penalizing communities for turning in their data. It’s deeply unfair journalistically.”
Explaining the methodology
MoneyGeek said its methodology, which derived a cost of crime per capita, utilized research from two University of Miami professors and a professor at the University of Colorado Denver, who produced economic costs for specific crimes. The cost assigned to each crime varied with its severity: murder was assessed at $9 million, while larceny was $3,500.
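The published figures imply straightforward per-capita arithmetic: multiply each offense count by its assigned cost, sum the results, and divide by population. In this rough sketch, the $9 million and $3,500 costs come from the article, while the offense counts and population are hypothetical:

```python
# Cost-of-crime-per-capita arithmetic implied by MoneyGeek's method.
# The $9 million murder and $3,500 larceny costs come from the article;
# the offense counts and population are invented for illustration.
cost_per_offense = {"murder": 9_000_000, "larceny": 3_500}
reported_offenses = {"murder": 50, "larceny": 4_000}  # hypothetical city
population = 190_000                                  # hypothetical population

total_cost = sum(cost_per_offense[c] * n for c, n in reported_offenses.items())
print(f"Total cost of crime: ${total_cost:,}")
print(f"Cost per capita: ${total_cost / population:,.2f}")
```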
Chris Roberts, an associate professor of journalism at the University of Alabama and a former database news editor at The Birmingham News, said MoneyGeek did an adequate job in explaining its methodology.
He said law enforcement leaders can quibble about the datasets and raise questions about whether economic costs should be applied to certain crimes. But he noted that it’s “something insurance companies do every day.”
“You have to admire, whether you believe the results or not, the information provided to the audiences,” said Roberts. “They used college professors and explained their methods and drew those conclusions. You can disagree with those conclusions and argue that the data is old, but (the 2021 FBI data) is the most recent data available.”
Lauritsen argues that the source of these studies is problematic. She said the rankings often come from websites purporting to be financial sites and are likely produced by researchers without criminology backgrounds.
“I’m not impugning anyone’s particular motives,” she said. “They don’t know what they don’t know about crime data. But you cannot imagine someone doing something similar for a disease – that this city is the most (dangerous) for a disease, without knowing how diseases are classified.”
Dive deeper
Critics also argue the data does not dive deeper into the cities, which are large and more complex than the datasets suggest.
“We know there are more dangerous places within a city, but what the rankings do is label an entire city as dangerous,” Lauritsen said. “And it’s not just city conditions that affect risk.”
Roberts, at the University of Alabama, agrees.
“A city is a big place,” he said. “We know some parts of cities are less dangerous than other parts. Commercial districts are different from retail districts and are different from single-family housing and other types of housing. To lump a big city into one thing is almost always problematic.”
Roberts said the problem with the rankings is that they often lack any journalistic reporting, such as the inclusion of quotes from local authorities or from police chiefs and other crime experts.
“Media is now selling nationally, and so a story like this is drawing (web) hits nationwide,” said Roberts. “A story I did 25 years ago (on crime statistics) for The Birmingham News was aimed at a more local audience. You will see that I’m quoting experts in the Birmingham area. It’s a local story and the police had opportunities to comment.”
McBride said there are fewer journalists covering cities around the country than there were 15 years ago, making it difficult for local officials to provide reactions whenever they find themselves on the wrong end of a ranking.
“This is a media pollution problem,” she said. “When you have pollution, you need better information. But the media is thinned out.”
Robbyn Taylor, a journalism professor at Troy University, said it’s important for the public, before assuming a city is “the most dangerous” or a beach is filled with assorted dangers, to “drill down into the websites” to learn more about their methodology.
Influence peddling, she said, is a concern when attempting to find out who might be financing a study.
“Is it funded by a political group? Is it funded by a group that is backed by an industry? If you find those things to be true, it’s a red flag and they are trying to have you buy into what they are saying,” Taylor said.
Beach dangers
Some of the rankings do not resonate with the public. That appears to be the case in Gulf Shores, where officials found their beaches among the “most dangerous,” according to the Travel Lens rating.
The company’s methodology included the number of shark attacks and surf zone fatalities since 2010, intermixed with the number of hurricanes since 1851.
Only eight surf zone and shark-related fatalities were accounted for in the Travel Lens analysis.
“It seems an odd mix to include 169 years of hurricane impacts with 12 years of surf fatalities and shark attacks,” said Beth Gendler, president & CEO with Gulf Shores & Orange Beach Tourism. “Those data sets logically don’t go together because of the vastly different time spans being considered.”
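Gendler’s objection is, at bottom, about mismatched time windows: counts gathered over 12 years and counts gathered over 169 years cannot be compared or combined without putting them on a per-year basis. A sketch of that arithmetic, with the hurricane count invented for illustration:

```python
# Raw counts gathered over very different windows aren't comparable.
# The eight surf zone/shark fatalities over roughly 12 years come from
# the article; the hurricane count is invented for illustration.
surf_shark_fatalities = 8    # 2010 onward, ~12 years
hurricanes = 25              # hypothetical count, 1851 onward, ~169 years

print(f"Surf/shark fatalities per year: {surf_shark_fatalities / 12:.2f}")
print(f"Hurricanes per year: {hurricanes / 169:.2f}")
# Summing the raw counts (8 + 25) implicitly weights 169 years of one
# hazard against 12 years of another -- the mismatch Gendler criticizes.
```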
Travel Lens, whose writers are based in the Philippines, did not respond to an email request for comment.
“We pay attention to any list we are included in, especially those that might cause our destination to be viewed negatively by potential visitors because it is important that we can counter any false or misleading information if needed,” Gendler said. “And our visitors will tell us when we need to address an issue of misleading information … we will get calls or emails asking specifically about the topic.”
She added, “So far, that cue has not happened, which tells us this list has not had an impact.”