Niam Yaraghi, Brookings Institution - October 24, 2014

Disclosure of the financial relationships between the medical industry and health care providers is a very important step toward transparency. Patients heavily rely on the recommendations of their doctors to make any kind of decision regarding their health and thus should have full awareness of payments between their doctors and the medical industry. Patients have a right to be informed about possible conflicts of interests.

A not so well-known provision of the Affordable Care Act is the Sunshine Act. The purpose of this act is to increase transparency in the health care market by requiring doctors, hospitals, pharmaceutical companies, and medical device manufacturers to disclose their financial relationships. As mandated by the Sunshine Act, on September 30 the Centers for Medicare and Medicaid Services (CMS) publicly released the first set of data, under the title Open Payments. The data cover $3.5 billion paid to over half a million doctors and teaching hospitals during the last five months of 2013.

A subset of the Open Payments data that is individually identifiable includes two categories of payments. The first comprises general payments made for purposes such as travel reimbursement, royalties, and speaking and consulting fees; the second comprises payments made as research grants. Together, these datasets include more than 2.3 million financial transactions amounting to more than $825 million.

Total Payments by Manufacturers of Drugs, Medical Devices, and Biologicals

General Payments
Teaching hospitals and physicians together received $669,561,563 in general payments from 949 different medical manufacturers. Interestingly, close to 70 percent ($460,369,403) of this amount was paid to individual physicians; the rest went to teaching hospitals. More than half of the total general payments were made by only 20 companies, led by Genentech, which paid $130,065,012 in general payments to various hospitals and doctors, most notably City of Hope National Medical Center.

Research Payments
Two hundred and ninety-four manufacturers awarded 23,225 research grants to teaching hospitals and physicians, with a total value of $155,815,828. About 70 percent ($107,969,961) of these grants were awarded to teaching hospitals; the rest went to physicians. The top 20 manufacturers contributed more than 75 percent of the total value of these grants. Bristol-Myers Squibb, which awarded $17,973,563 in research grants, leads the pack.

The following chart breaks down the payments of the top 20 most generous manufacturers of drugs, medical devices and biologicals to teaching hospitals and individual physicians.

Not surprisingly, the release of the payments data was not immune from criticism. The harshest ones were from the American Medical Association (AMA). In particular, the AMA cited "inadequate opportunity for physician review" and "inaccuracy of the data" as the main problems with the release of open payments data. Moreover, AMA was so concerned about the "misinterpretation" of the data that it released an official "Guide for Media Reporting" in which it "strongly encourage[s] members of the media to ... help the public understand the important role that appropriate relationships between physicians and industry has in advancing the practice of medicine."

It's critical to interpret financial transactions between medical providers and industry in context. Obviously, accurate and complete data is preferable to inaccurate and incomplete data. But neither problem could have been solved by withholding the data any longer.

Misinterpretation of the Open Payments data is an unfortunate yet inevitable problem. Given the volume and inherent complexity of this data, patients and the media face a steep learning curve before they fully understand its nuances. The AMA argues correctly that medical providers are best suited to educate patients and media representatives about this data. The Open Payments data may raise questions, but it is the lack of answers to those questions that will result in misinterpretation.

The AMA claims that technical problems with the website prevented physicians from verifying this information in the 45-day window provided by CMS. This may be true, but it's worth noting that a subset of hospitals and physicians accounted for 75 percent of the total payments. The window (which was later extended) may not have given all medical providers enough time to adequately check this data and inform CMS about possible errors, but it allowed enough time to audit the limited number of doctors and hospitals that were paid the largest sums of money. Moreover, CMS has already adopted a conservative approach, removing data for about one third of the payments due to possible errors. Those providers who received eye-popping sums from pharmaceutical companies and medical device manufacturers have already had abundant opportunities to verify this information and, where appropriate, provide context to media outlets.

Grants Awarded to Teaching Hospitals

General Payments
Approximately 1,110 hospitals received $209,192,160 in general payments, with 20 hospitals receiving 82 percent ($173,032,320) of that total. City of Hope National Medical Center received $122,586,713, more in general payments than any other hospital.

Research Payments
Six hundred and sixty different teaching hospitals were awarded $107,969,961 in research grants. Sixty-one hospitals, or 9 percent of the total, were awarded more than 75 percent ($81,241,699) of those grants. Dana-Farber Cancer Institute received $14,455,932 in research grants from various companies, more than any other teaching hospital.

Grants Awarded to Individual Physicians

General Payments
Nearly 466,000 physicians received $460,369,403 in general payments. About 10 percent ($42,555,940) of this amount was paid to the top 20 doctors, mainly as royalties and licenses. At the top of the list is Dr. Stephen S. Burkhart, an orthopedic surgeon from San Antonio who received $7,352,788 in royalties from Arthrex, a manufacturer of orthopedic surgical supplies.

Research Payments
The 3,744 doctors in the database received $47,845,867 in research grants. The top 10 percent of doctors received about 71 percent ($33,917,206) of these grants. Dr. Stuart S. Winter, a pediatric hematologist/oncologist at the University of New Mexico, received $2,886,844 from GlaxoSmithKline for conducting two research projects on Arranon, a brand-name drug for treating specific types of cancer.

This provision of the ACA offers valuable insight into data that was not previously shared with the American public, which is an important step toward broader transparency into the U.S. health care system.

Niam Yaraghi is a fellow in the Brookings Institution's Center for Technology Innovation. This piece originally appeared on the Brookings Institution's TechTank blog.

Why College Students Don't Learn Much

Michael Poliakoff, ACTA - October 24, 2014

"What Will They Learn?" asks the American Council of Trustees and Alumni's (ACTA) annual study of college core requirements. And given what we uncovered this year, it is no surprise that -- as famously documented by Richard Arum and Josipa Roksa in Academically Adrift: Limited Learning on College Campuses -- many students graduate without having learned much of anything.

Our methodology is simple. We look at whether an institution requires its students to study seven basic subjects: literature, U.S. government or history, foreign language, mathematics, economics, science, and composition. And what have we found? Only 13 percent of the nearly 1,100 schools ACTA evaluated require the equivalent of three semesters of foreign-language study. With all the national buzz about our need for more STEM education, fewer than two-thirds of schools require college-level math. In a globalized economy frequently beset by economic crises, just 3 percent require even a single course in basic economics.

In total, only 23 schools require at least six of these seven subjects.

What are students taking instead? There is a cornucopia of the enticing and nugatory. For example, at Harvard, students can fulfill their literature requirement with "American Dreams from Scarface to Easy Rider." At the University of Colorado-Boulder, students can take "Horror Films and American Culture" or "America Through Basketball" in lieu of an American-history course. The absence of strong general-education requirements has allowed too many students to replace intellectual rigor with the academic equivalent of junk food.

This curricular decline comes at a cost, producing citizens unable to compete globally or exercise responsible citizenship. Recently, the Organization for Economic Cooperation and Development (OECD) surveyed its member nations to determine the level of quantitative and verbal literacy that adults demonstrate. While America spends substantially more per student on higher education than any other OECD nation, we are far from the top when it comes to performance. The literacy level of four-year college graduates is below the average of our international peers. And numerous surveys have confirmed that American college graduates have stunning gaps in their knowledge. Nearly 62 percent could not identify the correct length of congressional terms. Thirty-nine percent didn't know Franklin Roosevelt was president during World War II.

And while the purpose of a liberal education is not merely to produce effective workers, employers have repeatedly emphasized that they wish college graduates possessed greater knowledge of foreign languages, science, and civics.

Given these facts, why are colleges consistently failing to provide the broad-based liberal-arts education Americans need? There is plenty of blame to go around. Students and parents have looked for prestige and reputation instead of educational quality in making college choices. Professors often prefer to teach courses in niche subjects that interest them, rather than in the subjects that are vital to students' success. Administrators use open curricula and catchy course titles to attract applicants in the fierce competition for students and their tuition dollars. And trustees fail to exercise oversight over their institutions.

The time has come to say enough is enough. It is time for every higher-ed stakeholder to reaffirm the value of a strong core curriculum. Students, parents, and donors need to vote with their feet and wallets, attending institutions and handing over money to schools that provide a solid curricular foundation. Professors need to remember that they are charged with molding informed citizens as well as with conducting research. Administrators and trustees need to insist on a course of study that will ensure every student learns the essential subjects that will prepare him or her for career and community. And policymakers need to insist that institutions which receive taxpayer dollars are equipping young people to be productive members of society.

The time for mediocrity is over. America's colleges and universities have been called the "envy of the world." If we restore the core, they can keep that moniker.

Michael Poliakoff is vice president of policy at the American Council of Trustees and Alumni.

Replacing the ACA: What Women Want

Grace-Marie Turner - October 24, 2014

Americans are angry about the manifest failure of Obamacare. Many have felt the impact directly, from the millions who lost their health plans after being promised they could keep them, to those facing sky-high deductibles and premiums, to those -- including some in the midst of cancer treatments -- who are losing access to their family doctors.

But at the same time, Americans want insurance that is secure and covers treatments they need. They want the uninsured and those with preexisting conditions to have access to coverage, and they know costs are rising dramatically. They know a safety net is needed for those who lose coverage, particularly those undergoing treatments for serious diseases.

Americans also know the uninsured, those on Medicare and Medicaid, and workers who have insurance through their jobs have different needs and require different solutions. And they know some ideas -- eliminating mandates on employers and individuals to give them greater flexibility, devolving power to the states when it comes to managing Medicaid, allowing patients to select their own coverage, cost transparency, and genuine competition on benefits and price -- show real promise.

Those who want to repeal and replace Obamacare need to understand what Americans -- and especially key demographics like politically independent women -- want from health-care policy. Toward that end, my organization, the Galen Institute, is conducting a series of focus groups, each comprising about a dozen women. We begin with a 20-minute summation of our ideas and goals, during which the participants communicate their response through "dial tests" in real time. Then the presenters go behind the two-way glass to watch as the moderator elicits feedback and reactions.

Women are very open to new ideas that could give them more choices when it comes to doctors, hospitals, and care. But the vein of compassion is strong, and our solutions will be buried if we don't show we care.

A young woman participating in the most recent focus group said she had had health insurance for her family through her husband's job, but "because of Obamacare," premiums soared: The policy's price tag rose from $400 to $900. The woman decided to keep her husband and children on the policy, but to go without herself. She's uninsured because of Obamacare, but -- and here is the most important message -- she said that's okay with her if it means a child with cancer now can get treatment who otherwise couldn't have. Most of the other women nodded in agreement.

These women understand the damage that Obamacare is doing, but they do not want to see people left behind. Talking of "repeal" alone won't work. If millions of people are getting health coverage now through Obamacare subsidies, our policies must have a safety net that continues to protect them.

We definitely have ideas to do that. People should be able to select the benefit packages that they want, not those they are forced by the government to buy. Americans would have a much broader choice of more affordable plans if there were genuine competition among health plans and insurers on benefits and price.

Health insurance should be portable so people don’t lose their policies when they lose or change jobs, and people should have guarantees that if they get sick, their premiums won’t soar.

Employers should have more, not less, flexibility to offer the coverage that suits their workers and their budgets without the threat of huge fines and penalties if they don’t comply with an avalanche of government regulations. And governors should be given more authority to structure their Medicaid programs to make sure their most vulnerable citizens have access to care.

A poll of voters in swing states taken by Public Opinion Strategies for Independent Women's Voice found that 58 percent of voters report that either they (21 percent), a member of their family (35 percent), or a friend (24 percent) have been affected by the ACA. Among the 58 percent who report an impact, negative impacts (60 percent) outnumber positive impacts (38 percent).

Voters will be looking for new solutions. But it doesn’t matter how right our policy ideas are if people don't understand them and if people don't think we care about how they have been harmed. Conservatives must explain our ideas in positive, human terms so people can see that freedom, not government, provides their ultimate security. Our ideas connect when we talk about the real needs of people, including the power of choice and competition to put the American people in charge of health reform that works for them.

Grace-Marie Turner is president of the Galen Institute, a think tank specializing in free-market ideas for health reform.

Blended Families Pose Challenges for States

Teresa Wiltz, Stateline - October 23, 2014

The Great Recession walloped almost every segment of American society. Millions lost their jobs, homes and businesses. Families lost trillions in household wealth. But a new study by a U.S. Census demographer finds that one group was hit hardest by the big downturn: "multiple-partner fertility" families, or families in which a woman has conceived children by more than one man.

The number of families with multiple-partner fertility, or MPF, is growing across all class and education levels. But they are more likely to be poor, uneducated and minority. They are people who have started life with significant disadvantages—and they tend to stay disadvantaged.

The complexity of their lives poses interesting challenges for state policy. A few state legislators are just starting to grapple with how to best help these families, from tweaking child support requirements to encouraging fathers' involvement in their children's lives.

The census study, conducted by Lindsay Monte earlier this year, found that MPF families spent significantly more time in poverty during the downturn and relied on food stamps much more than single-partner families did. "MPF families did experience disproportionate difficulties during the Great Recession," wrote Monte, a demographer with the Census Bureau's Fertility and Family Statistics Branch. (These families were not more likely during the recession to rely on welfare payments such as Temporary Assistance for Needy Families, the study found.)

MPF families have been around forever, largely the result of divorce and remarriage. But the phenomenon is growing across the U.S. as fewer Americans decide to marry, a trend that is particularly pronounced among millennials. More than half of all births to parents under age 35 are outside of marriage. Twenty-eight percent of all U.S. women with two or more children have children by different men, according to Cassandra Dorius, assistant professor of Human Development and Family Studies at Iowa State University.

"Today, we postpone marriage, but we don't postpone children. That's different from 50 years ago, when you got married and then you had your kids. That's not the case anymore," said Dorius, who conducted a national study of women at or near the end of their childbearing years to assess the impact of multiple-partner fertility on their lives.

Elizabeth Peters, director of the Urban Institute's Center on Labor, Human Services and Population, said multiple-partner fertility puts kids in households that are more chaotic, and sometimes makes contact with the noncustodial parent more difficult. That makes it less likely they will receive the kind of parental investment that will help them grow into productive, happy adults. "These kids are just inherently more disadvantaged. That's a policy issue. I don't know what the policy solution is. You can't legislate how people build families."

Portrait of an American Family
Researchers first began studying multiple-partner fertility in 2005, almost by accident. They started out by exploring how some men "swap families" after they have a child with a new partner. These men sometimes limit their financial support of their previous children or stop spending as much time with them. They also might limit involvement with their children that live elsewhere when their ex-partners move in with or marry someone new.

Most of the emerging research is on the lives of mothers who bear children by more than one man, rather than on the fathers who have children with more than one woman. The vast majority of MPF women in their late 20s and early 30s live with all of their children, while the vast majority of MPF fathers do not live with all of their children, according to research conducted by Karen Guzzo, associate professor of Sociology at Bowling Green State University.

MPF is more common among minority women, with 59 percent of African-American mothers, 35 percent of Hispanic mothers, and 22 percent of white mothers reporting having children by more than one partner. But the issue is less tied to race than it is to class, Dorius said. Black families are more likely to be poor, for example, and African-American men experience a much higher incarceration rate than any other racial or ethnic group, leading to a shortage of available men.

"Very rich women and very educated women also have children with more than one man," Dorius said. "This isn't necessarily a racial story. (MPF) is really concentrated in early disadvantage and in America today, that's caught up in race."

According to Dorius, women who conceive children with multiple partners often get pregnant at a younger age and are not living with their partner when they give birth. They are also less likely to have support from their families, have less access to child care and report poorer quality relationships, according to Dorius.

MPF mothers also have more health problems and depression as they reach middle age. Stress tends to breed more stress: Young couples faced with early parenthood are at risk for breaking up and meeting someone else, and then having children with their new partners.

MPF impacts families across generations as well. Teens growing up in a home with half-siblings are more likely to use drugs and start having sex earlier than other children, according to Dorius.

"These children have to juggle across households. Every Christmas, every Thanksgiving, every holiday is complicated," Dorius said. "It makes family life really confusing. You can imagine a scenario where you have a mother and a father living together and with one child, he's the (biological) father and with the other, he's the stepfather. It makes for really complicated family life."

Family scholars are trying to uncover whether having children with more than one person magnifies inequalities over the course of the mothers' lives, Dorius said.

Fewer Marriages
The higher your educational level, the more likely you are to have a child while married—and stay married. Couples faced with economic insecurity are less likely to want to stay together or marry, according to Arielle Kuperberg, assistant professor of sociology at the University of North Carolina at Greensboro.

"It makes it harder for people to plan for the future; it makes them less willing to settle down. Marriage is something that happens after they achieve financial stability. In the past, marriage was the path to financial security," said Kuperberg.

The rocky economy and the erosion of stable manufacturing jobs have contributed to the low marriage rates. A few decades ago, a recent high school graduate could easily find a good job paying a living wage, one that allowed for buying a home and raising a family.

Today it's much harder for young people to find steady work, even if they have college degrees. As the manufacturing sector has shrunk, more of the jobs that are available are seasonal or have irregular hours or don't pay much, none of which are conducive to stable family life, according to Kuperberg. Full-time jobs with benefits are becoming more elusive as workers on the lower end of the job ladder have to cobble together multiple jobs and multiple shifts to make ends meet.

"If the economy continues to be very restrictive for young adults, as it gets harder and harder for young adults to establish themselves, we'll see more unwed fertility, and more multi-partner fertility," Kuperberg said.

States' Response
Economic policies that work to ease the unemployment rate would help to raise marital stability, Kuperberg said. Legislation that curtails erratic, "on-call" schedules for shift workers also would increase family stability, she said.

State policies tend to focus on some of the issues surrounding MPF, rather than the phenomenon itself, according to Rochelle Finzel of the National Conference of State Legislatures. MPF families often rely on child support, which is an important source of income for custodial parents, specifically those with low incomes, Finzel said.

"States play a strong role in child support from the beginning of the process to establish the order to collecting and enforcing those orders. The child support program is continuously evolving to meet the needs of custodial parents and is increasingly looking at ways to better engage with noncustodial parents to ensure they are able to provide consistent and reliable support," Finzel said.

Still, cautioned Dan Meyer, a professor in the School of Social Work at the University of Wisconsin-Madison, "child support policy is particularly difficult. It's not just how to divide up the payments that are made; it's how much we should expect" noncustodial parents, usually fathers, to pay under child support orders.

Often, a father with children by different mothers will end up paying a disproportionate share of child support: if a father has two noncustodial children by the same mother, he pays a reduced amount for the second child, but a father with two children by different mothers is required to pay the full amount for each.

While the bulk of legislation has focused on collection and enforcement measures, many states have taken a direct look at military families to make sure the child support program meets their unique needs, according to Finzel. Legislators are also talking more about how the child support program can better help noncustodial parents (often fathers) find employment so they are able to meet their child support obligations, she said.

There are limits to what policy can achieve, Dorius said, and intervention could run the risk of needlessly interfering in how people choose to build their families. Ultimately, she said, most women who've had children with multiple men end up marrying one of their partners and forming stable marriages.

"We tend to think of women with MPF as being only poor single women with little education and money, and it is true that they tend to start out that way," she said. "But these women represent a large segment of American society, and over the decades that we studied these families, most of the women married, went to school, and held full-time jobs; basically, doing all the things you're supposed to do to live the American Dream."

Teresa Wiltz is a staff writer at Stateline, an initiative of The Pew Charitable Trusts, where this piece originally appeared.

The International Tobacco Tax

Michi Iljazi - October 22, 2014

Earlier this month in Moscow, the World Health Organization (WHO) Conference of the Parties 6 (COP6) was held, with officials from around the world discussing a wide range of issues. One of those issues was an attempt to impose international taxes. We at the Taxpayers Protection Alliance (TPA) signed on to a coalition letter urging the attendees to stay away from any new tax proposals that could harm the economies of nations across the globe.

What happened at the conference has given rise to even more concerns, both about taxation and about transparency. While the United States is not bound to anything being proposed, the tone being set on the international stage should worry taxpayers.

The attendees adopted a proposal for a 70 percent global excise tobacco tax -- after ensuring that tobacco farmers and media observers had been removed from the conference. Both the World Farmers Organization and INTERPOL were denied access prior to the vote actually taking place. No record was kept of what was said.

As Washington Times columnist Drew Johnson reported:

A tobacco reduction conference hosted by the World Health Organization, the United Nation's public health agency, took a hostile and alarming turn on Monday when the public was kicked out of the meeting. . . . Delegates from more than 175 countries who are part of the Framework Convention on Tobacco Control, a UN global anti-tobacco treaty, agreed unanimously to boot spectators. Delegates then voted to ban the public from the Moscow conference center where the event is taking place for the duration of the week-long meeting.

"We don't need the public here!" proclaimed Uganda's representative. Libya's chief delegate Mohamed Ibrahim Saleh Daganee gritted his teeth as he demanded other delegates join him in voting to close the meeting to the public. "We don't know who these people are," complained Mr. Daganee, a former health information director under Muammar Gaddafi.

TPA has reported on tobacco taxes both here and abroad. Imposing (and increasing) tobacco taxes incentivizes black-market activity, costs businesses sales, and often generates far less revenue than expected. The recent coalition letter summed up the position of taxpayer groups from nations all over the world:

These international threats to tax sovereignty are real and they are expanding. Such policies would disproportionately hurt lower income people across the globe as the cost of consumer products would increase. Attempts at establishing international tax regimes would inordinately expand the reach of the EU and the UN. As leaders of groups that support free and open markets and tax competition, we oppose any effort on the part of any international body to levy further taxes on hardworking families.

Meeting secretly to plot tax increases is not the way to ensure a thriving global economy and robust participation from the United States and other countries. It is, rather, a recipe for economic disaster.

TPA has been wary of the COP6 from the start. Unfortunately, some of the worst fears about the agenda were confirmed as the conference unfolded. Decisions on taxation should be left to individual nations, not to the World Health Organization.

Michi Iljazi is communications and policy manager for the Taxpayers Protection Alliance.

Robert VerBruggen - October 21, 2014

Last week, we reprinted a ProPublica piece analyzing police killings in detail. It made the striking claim that, on a per capita basis, black males in their late teens are killed at a rate 21 times that of their white peers.

This is a different picture from the one painted by Vox earlier this year using some of the same data. Vox found that blacks were 32 percent of those killed by police -- right in between blacks' proportion of the general population (about 13 percent) and blacks' proportion of homicide offenders whose race is known (52 percent).

The key variable is age: The ratio of blacks to whites killed by police is much higher among teens than among older adults. And so is the ratio of blacks to whites identified as homicide offenders, a decent measure of involvement in serious violent crime.

I downloaded all of the FBI Supplementary Homicide Reports (the data set used by both ProPublica and Vox) from 2007 through 2011, and pulled the ages, races, and ethnicities of those killed by police or identified as homicide offenders. (For offenders I excluded justifiable homicides, accidents, and negligence. Also, in some cases more than one offender is listed, but hunting those secondary offenders down through computer code would have been fairly elaborate, so I left them off.) I focused on those who were identified as white but not also as Hispanic, and those identified as black. The ethnicity data are spotty, so this probably still counts some Hispanics as white.
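For readers curious about that filtering step, here is a minimal sketch of the logic in Python (the original analysis was done in R); the field names and sample records are purely illustrative, not the SHR's actual layout:

```python
# Illustrative homicide-offender records; field names are hypothetical,
# not the FBI Supplementary Homicide Reports' real column names.
records = [
    {"age": 17, "race": "black", "hispanic": False, "circumstance": "other"},
    {"age": 34, "race": "white", "hispanic": False, "circumstance": "other"},
    {"age": 22, "race": "white", "hispanic": True,  "circumstance": "other"},
    {"age": 19, "race": "black", "hispanic": False, "circumstance": "justifiable homicide"},
]

# Exclusions described in the text: justifiable homicides, accidents, negligence.
EXCLUDED = {"justifiable homicide", "accident", "negligence"}

# Keep blacks and non-Hispanic whites, dropping the excluded circumstances.
offenders = [
    r for r in records
    if r["circumstance"] not in EXCLUDED
    and (r["race"] == "black" or (r["race"] == "white" and not r["hispanic"]))
]

print(len(offenders))  # 2 records survive the filter
```

As noted above, the real data's ethnicity field is spotty, so a filter like this will still count some Hispanics as non-Hispanic white.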

As usual, I should caution that I majored in journalism and as a result kept the analysis as simple as I could. You can see my data and R code for yourself, and please e-mail or tweet me if you have questions or comments.

Here's how the ages shake out for those killed by police:

Note that these are raw numbers; they are not adjusted for the fact that there are nearly five times more non-Hispanic whites than blacks in the U.S. So even among older Americans shot by police, blacks are disproportionately represented. But that disproportion is far greater among the young, to the point that blacks outnumber whites significantly.

The same thing is true of homicide offenders, though higher sample sizes make the data look more orderly:

Finally, here's the ratio of those killed by police to homicide offenders by age and race. If anything, the white ratio is higher:

Or, to look at the question differently, the ratio of blacks to non-Hispanic whites is .82:1 for police killings and 1.5:1 for homicides.
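The arithmetic behind converting a raw-count ratio into a per-capita rate ratio is worth making explicit. This is an illustrative sketch only, using the rough five-to-one white-to-black population figure cited above; the counts are invented for the example:

```python
# Converting a raw-count black:white ratio into a per-capita rate ratio.
# A group one-fifth the size producing a similar raw count has roughly
# five times the per-capita rate. Counts below are illustrative, not real data.

def rate_ratio(black_count, white_count, white_pop_per_black_pop=5.0):
    """Per-capita black/white rate ratio from raw counts and a population ratio."""
    # Normalize each count by its group's (relative) population size.
    return black_count / (white_count / white_pop_per_black_pop)

# A raw-count ratio of 0.82:1 (e.g., 82 vs. 100) scales to roughly 4.1:1 per capita.
print(round(rate_ratio(82, 100), 2))  # 4.1
```

This is why the same underlying data can support both "blacks are a minority of those killed" (raw counts) and "blacks are killed at several times the white rate" (per capita).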

Two conclusions here. One, while this certainly doesn't prove police bias is irrelevant, it does call into question how important it is. And two, when it comes to the interaction between age and violence, there are very different patterns for whites and blacks. These patterns should be taken into account when we look at statistics in specific age ranges -- and they might deserve more study in themselves.

[Update: In the first paragraph, I changed "blacks" to "black males" to accurately reflect the calculation ProPublica did. Also, over at Fox News, John Lott has an interesting analysis of the data as well. He highlights some issues with the FBI's statistics, and reports that a criminologist ProPublica quoted had asked not to be quoted. ProPublica disputes what the criminologist told Lott.]

Robert VerBruggen is editor of RealClearPolicy. Twitter: @RAVerBruggen

Restarting Nuclear Power

Derrick Freeman - October 20, 2014

Japanese prime minister Shinzo Abe has endorsed the restart of the nuclear reactors at Kyushu Electric's Sendai power plant in southern Japan. This is a remarkable turnaround -- considering all of Japan's 48 nuclear reactors were shut down after an earthquake and tsunami led to meltdowns at the Fukushima Daiichi plant in March 2011 -- with lessons for policymakers here in the U.S.

Japan's decision comes after its newly formed Nuclear Regulatory Authority (NRA) issued a more than 400-page safety report in July showing that Kyushu Electric's safety assessment met the new regulatory standards, followed by a month-long public comment period. The NRA replaced the patchwork of bureaucracies that had been responsible for oversight of the nuclear industry before the accident and is widely viewed as much stricter than the previous regime. The Japanese government has worked hard to restore confidence in the nuclear-power industry, for example by requiring all communities within 30 kilometers of a plant to submit evacuation plans for approval.

Tokyo's green light, however, is the first of many hurdles the nuclear industry must clear to get back on line. The next for Kyushu Electric is to gain consent from the governor of Kagoshima Prefecture and the mayor of Satsumasendai, where the plant is located.

Interestingly, support for the restart of Japan's nuclear reactors runs highest in the communities that host them. Overall, however, the public remains wary. Before the accident, nearly two-thirds of the public supported building new nuclear reactors; a national public poll taken in July and published by the Asahi newspapers found 59 percent opposition to the restart at Sendai.

Nonetheless, there are compelling reasons for Japan to flick the "on" switch for nuclear energy. Since the mothballing of its reactors, the Japanese economy has been in decline, and the country's utility sector has experienced tremendous losses each year. Nuclear power, once 30 percent of Japan's electricity generation, has had to be replaced with other fuels, primarily fossil. Resource-poor Japan has been obliged to depend more on imported natural gas, coal, and oil to meet its electricity needs. This reliance comes at a very heavy price for both Japan and its citizens. Natural-gas prices in Japan hit a record high of $20.125 per million BTU earlier this year, whereas the United States' Henry Hub Natural Gas Spot Price has averaged $4.675 through August 2014. Higher import prices for fossil fuels and a weaker yen have led to Japan running trade deficits for the first time in three decades.

Imported fossil fuels generated 88 percent of Japan's electricity last year, compared with 62 percent in 2010, according to Japan's Ministry of Economy, Trade, and Industry. Household consumers have felt the pinch of these expensive imports as their electricity rates soared 19.4 percent, while industrial users were hit with a 28.4 percent rise.

Along with prices, greenhouse-gas emissions are also soaring. They are up 7.4 percent since fiscal year 2010 despite a decline in manufacturing production, the aggressive implementation of efficiency measures in households, and a significant ramp up in renewable energy. Last November, Japan announced that it would target a 3.8 percent emissions cut by 2020 versus 2005 levels. This amounts to a 3 percent rise from the U.N. benchmark year of 1990, rather than the 25 percent cut Tokyo previously promised to meet its Kyoto Protocol commitment. "Given that none of the nuclear reactors is operating, this was unavoidable," Nobuteru Ishihara, Japan's Environment Minister, has said.

The Sendai plant restart is months away if it happens, and uncertainty still surrounds Japan's energy future. But it appears that, almost four years removed from the devastation at Fukushima, Japan is turning the corner and moving to restore public confidence in its nuclear power infrastructure.

The United States is 35 years removed from Three Mile Island, a much less dangerous accident, yet the anti-nuclear backlash it engendered has yet to abate. However, there are some bright spots in the U.S. nuclear-power sector, with five reactors under construction, including the Tennessee Valley Authority's Watts Bar 2 plant, which will be the first new unit to come online in America since Watts Bar 1 did in 1996. Watts Bar 2 should begin operation in the latter part of 2015.

Ironically, Japan appears to be doing a more expeditious job of dispelling public fears in large part by installing a strong, U.S.-style nuclear regulatory system. This suggests that Germany may have acted too hastily in announcing the shutdown of all its nuclear power plants after the Fukushima accident. In any event, let's hope Japan's example inspires the United States to proceed with building new nuclear plants -- and restoring U.S. global leadership in safe nuclear technology.

Derrick Freeman is a senior fellow and director of the Energy Innovation Project at the Progressive Policy Institute.

Search Engines, Free Advertising, and the Content Industry

Mytheos Holt, R Street - October 17, 2014

In 1925, a music industry professional complained to the New York Times that the new medium of radio was destroying the industry's business model by making songs too widely available to the public for free. The Times quoted the unnamed professional saying:

The public will not buy songs it can hear almost at will by a brief manipulation of the radio dials. If we could control completely the broadcasting of our compositions we would endeavor to prevent this saturation of the radio listeners with any particular song. . . .

We are striving to have the copyright law, which protects us from the free use of our compositions by the makers of phonograph records and music rolls, construed to cover broadcasting. The law specifies that we must be compensated if any of our songs are used by some one else for profit to them. We contend that the radio station is an enterprise founded for gain. It is not controlled by those purveying sets or parts, it is employed by organizations which use it as a medium of institutional advertising.

The music industry professional got his wish as far as copyright, and has turned out to be right in another way as well, though surely not in a way he would have expected. Radio is treated as free advertising -- and primarily for music producers! This is, in fact, the main reason why terrestrial radio stations are presently statutorily exempt from paying royalties.

Today, the story of radio's transition from content industry bete noir to one of its core advertisers is being replayed in the case of another medium that content industry professionals once lambasted as nothing but a gateway for pirates: search engines.

In a report released today by Google, the company lays out the case that search engines aren't a major driver of piracy. The report claims that search is responsible for just 16 percent of the traffic to sites that host pirated content. By contrast, studies have shown that 64 percent of the traffic to legitimate sites comes from search engines.

To take one example, "katy perry" gets searched for 200,000 more times on Google than "free katy perry mp3." What's more, under new changes to the company's search algorithm, legitimate sources of Katy Perry's music will show up first on both searches, at no cost to Perry herself (though individual content salesmen such as Apple, Amazon or Spotify can pay to have their digital storefronts advertised prominently).

Starting in 2012, Google began "downranking" sites that receive a high volume of Digital Millennium Copyright Act take-down complaints, meaning that those results automatically are ranked lower in Google's search algorithm. The new tweaks will go further to prioritize results for legitimate sources of film, music, and other copyrighted content, as well as offering users multiple sources from which that content can be purchased, rented, or streamed. This would apply even for obvious piracy-oriented searches, such as "the big lebowski torrent."

In short, for content producers, search engines serve as a form of free advertising, paid for either by middlemen online retailers, or by the search engine itself. As the Google report puts it:

Piracy often arises when consumer demand goes unmet by legitimate supply. As services ranging from Netflix to Spotify to iTunes have demonstrated, the best way to combat piracy is with better and more convenient legitimate services. The right combination of price, convenience and inventory will do far more to reduce piracy than enforcement can.

Consumers have a huge appetite for content, and there's evidence that they're willing to pay a lot for it. A report from May 2013 found that the most frequent consumers of pirated digital files actually spend 300 percent more on content than so-called "honest" consumers. This tendency for piracy itself to serve as a form of free advertising is why savvy media producers, such as the makers of HBO's Game of Thrones, find high piracy rates flattering, rather than alarming. Once HBO's new stand-alone streaming service launches, these users of pirated content easily could turn into legitimate consumers.

Search engines thus have a huge opportunity to exploit a market with an above-average appetite for content and expose it to more ways to purchase that content. This benefits search engines like Google, but it also benefits the content industry itself.

Of course, as the 1925 Times quote demonstrates, the content industry hasn't always been eager to embrace innovation. The disruptive effects of the Internet have shaken a lot of established content industry business models, which has led some of those industries into efforts at outright censorship, both through abuse of the DMCA's take-down system and through attempts to bake censorship into the Internet itself, via new legislation and trade agreements.

Google's report provides some truly lurid examples of what that "abuse" looks like, such as a film company allegedly trying to get a major newspaper's review of their film blocked in search results. Techdirt has also outlined some truly ridiculous examples of DMCA takedown requests. In view of these shameless attempts at censorship, Google's decision to protect against DMCA abuse from both directions is prudent. It remains to be seen whether these safeguards will continue to hold, but the proliferation of information about fair use on the Internet suggests reason for optimism.

Distinguishing between fighting piracy as an industry, and fighting individual pirates, who are rarely the hardened criminals that content industry advocates paint them as, could be a major step toward a better Internet both for consumers and producers.

This piece originally appeared at the R Street Institute's blog. Mytheos Holt is an R Street associate fellow.

The TPP Can Promote Medical Innovation

Eric V. Schlecht - October 17, 2014

This month, international trade representatives will head to Australia to continue negotiating the Trans-Pacific Partnership (TPP). Many believe this process is close to completion. A finalized TPP holds real hope for boosting global commerce and driving sustained economic growth for all countries involved.

It also has the potential to fuel medical innovation and bolster public health long into the future. But in order to realize that goal, some critical details need to be ironed out -- namely, the deal must ensure greater transparency and efficiency in how medicines make their way to patients in TPP markets. Strong provisions in the TPP will ensure that American companies are treated fairly when they sell their innovations abroad, and that patients in TPP countries have access to important medical treatments.

To their credit, U.S. negotiators seem keen to include these measures in the final deal. But they've recently come under attack from powerful U.S. interest groups that want to weaken protections for American companies. This would be a grave mistake.

The U.S. leads the world in the development of new drugs. We've produced more than 400 approved medicines since the turn of the century, and we've put another 5,000 medicines into clinical trials around the globe. These advances don't come easy. Our pharmaceutical industry invests more than $50 billion in research and development each year -- more than any other country by far -- to continue churning out medical advancements.

And of course, these breakthroughs aren't just reserved for American patients. They're used worldwide to improve -- and save -- countless lives. But when these treatments become available beyond our shores, they are often prevented from making their way to patients due to inefficient, slow, and complicated government approval, pricing, and reimbursement systems.

New Zealand is one of the major economies participating in the TPP, and it demonstrates the risks negotiators will be running if the proper protocols aren't required. The country's public-health system, PHARMAC, is charged with drug regulatory approvals and pricing and reimbursement policies. Since being made an independent body in 2001, PHARMAC has lacked basic transparency and accountability measures for its decisions.

It routinely denies foreign companies the most basic considerations -- such as guidelines for registering new drugs -- and reasonable timelines for approval or reimbursement decisions.

This has effectively denied patients and doctors the opportunity to weigh in on decisions that affect public health -- and as a result, these decisions tend to focus narrowly on cost at the expense of access to new medicines. One analysis found that of 83 prescription medicines registered with neighboring Australia between 2000 and 2006, only 22 were reimbursed in New Zealand. Overall, there is an average lapse of three years between a drug's first approval in any country and its approval in New Zealand. This is surely part of the reason why New Zealand ranks a dismal 14th out of 19 OECD countries in terms of the annual number of patient deaths from treatable conditions.

The TPP must require that these decisions no longer be made in a vacuum. Governments' systems for pricing and reimbursing medicines for the purpose of public programs should be transparent, timely, and predictable. Drug makers should be allowed to appeal rate and approval decisions to an independent administrative body. And all decisions need to be substantiated by the latest science. Failing to provide these basic protections to drug companies severely limits their ability to research and develop drugs.

These are essentially the provisions enshrined in KORUS, America's landmark free-trade deal with South Korea, which came into force in 2012. This agreement was created with the goals of preventing arbitrary decisions, preserving patient access to a wide array of high-tech medicines, and retaining the incentives for future innovation.

A cadre of high-profile U.S. interest groups has mounted a concerted effort against including such strong provisions in the new TPP deal. This campaign includes the AARP, the public-employee union AFSCME, and the private-sector AFL-CIO and SEIU labor organizations. They've all repeatedly petitioned federal trade officials to back down from including these important protections in the deal.

They're worried that these provisions would somehow open the door for drug manufacturers here in America to challenge the payment policies of Medicare, Medicaid, and other public insurance programs. That fear is baseless.

In a letter reported by Inside U.S. Trade, the U.S. trade representative explicitly addressed these concerns: "These are straightforward provisions that will not require any changes to any U.S. healthcare laws nor will they affect the U.S. Government's ability to pursue the best healthcare policy for its citizens, including future reforms or decisions on healthcare expenditures."

Unlike many other nations, the U.S. government does not dictate drug prices through its government insurance programs. Reimbursement rates are, by design, largely tied to prices on the open market. In America, government health-care programs are intended for specific segments of the population, and the prices under those programs are tied to reported prices based on commercial sales in competitive markets.

In fact, Medicare Part D is a "best practice" example of this: It embraces private competition, not government price controls; it's completely delivered through private plans; and the government pays for plans based on competitive bids.

The TPP should include the same strong transparency provisions that were inscribed in the U.S. trade agreement with South Korea. This will not only ensure that citizens in TPP partner countries have timely access to safe, effective medicines, but also that U.S. companies can continue to produce them long into the future.

Eric V. Schlecht is a writer who has worked on budget and economic issues in Washington, D.C., for more than 20 years. He has served in leadership offices in both the U.S. Senate and House of Representatives.

Does Eminent Domain Even Raise Revenue?

Dean Stansel & Carrie Kerekes - October 17, 2014

Proponents of eminent domain for private development -- i.e., of forcibly taking private property and giving it to another private party -- claim it will generate more revenue for state and local governments. The Supreme Court even based its landmark 2005 case Kelo v. City of New London on this assertion, holding that the alleged economic benefits for communities legally justify these takings as "public use."

The claim that eminent domain leads to higher revenues has largely gone unchallenged. We recently examined the available data, and our study finds virtually no evidence that eminent-domain activity for private development is associated with higher government revenue. To the contrary, we find some evidence that eminent domain is associated with lower growth of government revenue in the future.

In other words, governments' primary justification for taking property from private owners like Susette Kelo and transferring ownership to big companies like Pfizer is based on faulty assumptions. In fact, the redevelopment plan for which Ms. Kelo's house (and those of her neighbors in New London, Conn.) was taken never happened. The land was actually used as a temporary dump for storm debris in the aftermath of Hurricane Irene in 2011.

Confiscating someone's home or business and using the land as a dump is an egregious property-rights violation. Even if eminent domain for private development did achieve the objective of producing higher revenues for state and local governments, it would be an abhorrent activity. However, it also has serious negative implications for the future economic prosperity of the community.

Private-property rights are the foundation of a successful market economy. Any encroachments on private-property rights -- like eminent domain -- hamper economic growth and result in lower standards of living than we would otherwise enjoy.

For example, in countries like Cuba and North Korea, where private-property rights are very insecure, entrepreneurs are less willing to invest in the new machines and equipment they need to expand their businesses. Individuals in these countries have a reasonable expectation that any machinery or equipment, or overall business or land itself, may at some point be taken from them by government predation or by individual criminals.

Fortunately, property rights in the United States are relatively secure -- but things are heading in the wrong direction. The Fraser Institute publishes an annual index that ranks countries according to their economic freedom using data in five areas: size of government, legal system and property rights, sound money, freedom to trade internationally, and regulation. In the recently released 2014 Economic Freedom of the World report, the United States fell to 12th, down from the second spot in 2000 and the seventh spot in 2008. In the area of "legal system and property rights," the United States fell all the way to 36th.

Our study's findings confirm that policymakers and the public are right to be skeptical of attempts to justify the seizure of private property with the promise of future financial windfalls. In reality, these encroachments may hamper economic growth and lead to lower standards of living for more than just those who have lost their homes or businesses.

Dean Stansel and Carrie Kerekes are economics professors at Florida Gulf Coast University and authors of a new study entitled "Takings and Tax Revenue: Fiscal Impacts of Eminent Domain," published by the Mercatus Center at George Mason University.

When 'Niceness' Becomes Tyranny

Thomas K. Lindsay - October 16, 2014

Take this quiz. In which of the following venues -- (a) The New York Times or (b) Fox News -- did the following report appear? "Teacher[s] . . . [are] frightened of the pupils and fawn on them." The "students make light of their teachers. . . . And, generally, the young copy their elders and compete with them in speeches and deeds, while the old come down to the level of the young; imitating the young, they are overflowing with . . . charm, and that's so that they won't seem to be unpleasant or despotic."

The answer is "none of the above." This account is nearly 2,500 years old, coming from Socrates in Plato's Republic, and is part of an analysis of how democratic freedom, taken to its extreme, can culminate in collective tyranny.

What has been the effect of our Niceness Crusade on today's children? A New Yorker piece blames parents for creating young children who are, as the article's title states it, "Spoiled Rotten." ("Why do kids rule the roost?" asks the subheadline.) What kind of college students do such children then grow up to become? A recent New Republic article by a former Yale professor worries that today's students at elite universities are "entitled little sh[**s]." Another study, of Bowdoin College students, conducted by the National Association of Scholars, finds these students guilty of "knowingness," which is "the antithesis of humility," the "enemy of education," and "a formula for intellectual complacency." Contrast this with Socrates' famous formulation, which served for centuries as liberal education's animating principle: "The unexamined life is not worth living for a human being."

Why might today's parents fail to exercise the leadership necessary to enforce the discipline necessary to their children's maturation? How have the relations between the young and old been turned upside down, with the older, more experienced generation now fearing to offend the younger, less-experienced generation, rather than vice versa?

Doubtless, a variety of factors are at play here, but, for Socrates, democratic justice, which he finds to be the principle of freedom, degenerates -- as do all political principles -- through being taken to its extreme. Liberty, which in the highest sense consists in freely choosing to restrain one's passions in order to pursue the just course, degenerates into license, which is liberty unrestrained by any purposes higher than freedom itself; that is, license is irresponsible freedom.

So unquenchable can become democracy's passion for freedom-as-unrestraint, argues Socrates, that, by virtue of its logic, it extends freedom ever wider, eventually to the animals themselves: "There come to be horses and donkeys who have gotten the habit of making their way quite freely and solemnly, bumping into whomever they happen to meet on the roads, if he doesn't stand aside, and all else is similarly full of freedom." This fantastic scenario is meant intentionally to be dreamlike, but it resonates with us today when we consider the principles animating the animal-rights movement. Some recall the saga of the ill-fated "Baby Fae," a newborn whose heart condition led doctors to take the desperate, ultimately unsuccessful, measure of transplanting a baboon's heart to her in hopes of saving her life. This produced outcries from the animal-rights movement, which critiqued the morality, as one scholarly paper puts it, "of the taking of an innocent animal's life to attempt to save the life of an innocent human." Similar human-animal equations appear regularly from PETA, such as its "Holocaust on Your Plate" campaign.

The political consequences of taking freedom to its extremes, according to Socrates, are that democracy's citizens "end up . . . by paying no attention to the laws, written or unwritten, in order that they may avoid having any master at all." Under the new dispensation, then, the old come down to the level of the young; parents, to their children; human beings, to animals; and -- thanks to today's popularization of moral relativism -- objective Truth falls to subjective choice. All this in order that all may be fully "free."

How might we rediscover a sound basis for teaching and practicing self-restraint? We could start by reading Plato, who teaches that, while it is in one sense natural for us to "want what we want, when we want it," human nature at its deepest longs for something more, something higher. We long to discover and participate in a good of such nobility that it trumps our lower desires.

There was a time, of course, not so long ago, when many college students could be expected to study (because it was required) Plato's Republic. But those bad old days of making students read things they might not want to read, such as difficult Platonic dialogues, surrendered to the same passion that Plato finds threatens to transform democratic liberty into tyrannical license. In the name of "student choice," our university elders, not wanting to seem "unpleasant or despotic," abandoned required core curriculums a half-century ago, replacing them with their intellectually spineless shadows -- "general education" and "distribution requirements."

It is difficult to envision today's universities restoring the type of rigorous, required core curriculum in which students would be compelled to encounter a text like the Republic, through which they might receive the greatest gift of all -- coming better to understand themselves and what they believe through engaging in a serious conversation with a mind greater than their own who challenges them to examine their unexamined assumptions. But if universities do not take this courageous step, they doom their students to lives suffocated by prejudice, by the "knowingness" and sense of entitlement that is death to intellectual as well as political liberty. If American higher education, which has come so much in our increasingly secular society to be the chief crafter of the culture, fails to seek to arrest the degeneration for which it is in some part culpable, Socrates would argue that we can next expect a culture in which "insolence" will come to be labeled "good education; anarchy, freedom; wastefulness, magnificence; and shamelessness, courage."

From this shift in the culture, Socrates concludes, comes "the beginning, so fair and heady, from which tyranny . . . naturally grows." Which leaves a question for today's parents and educators: Are the benefits of our aimless "niceness" toward our children worth the price? If not, then for their sake, as well as ours, we adults might consider acting again like grown-ups.

Thomas K. Lindsay directs the Center for Higher Education at the Texas Public Policy Foundation and is editor of He was deputy chairman of the National Endowment for the Humanities under George W. Bush. He recently published Investigating American Democracy with Gary D. Glenn (Oxford University Press).

Jacob Anbinder - October 16, 2014

New York's transit system is the lifeblood of America's largest metropolis. Comprising subways, commuter trains, and ferries, it's famously vast, and millions of commuters rely upon it every day.

Salt Lake City's is . . . not. Utah's capital features light rail, bus rapid transit, and even a commuter rail line. But these transportation options -- many of which are less than a decade old -- are hardly integral parts of the city's identity.

But not so fast, spoiled New Yorkers. Salt Lake's relatively modest transit network actually outperforms its New York counterpart on one essential measure: providing access to a high percentage of the region's jobs. That's according to data in the Access Across America report, the latest by transportation planner David Levinson and the Accessibility Observatory at the University of Minnesota.

The study, which ranked urban areas by their transit systems' ability to provide access to jobs, revealed some surprising truths about the impact of mass transit on urban mobility.

1. Percentages tell a different story.

Levinson and his colleagues ranked 46 of the largest American Metropolitan Statistical Areas (a definition used by the Census Bureau) by the number of transit-accessible jobs in the urban area in six different increments of time -- from ten minutes to one hour. The calculations were "worker-weighted," meaning they accounted for the actual residential patterns of each city.
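A "worker-weighted" measure of this kind can be sketched simply: the jobs reachable from each home zone within the time threshold are averaged, weighted by how many workers actually live in each zone. This is a hypothetical illustration of the concept, not the Accessibility Observatory's actual methodology, and the zone data below are invented:

```python
# Minimal sketch of a worker-weighted job-accessibility measure.
# Each zone lists (travel_time_minutes, jobs_reachable_at_that_time) pairs;
# all numbers here are invented for illustration.

def worker_weighted_access(zones, threshold_minutes):
    """Average number of reachable jobs per worker across all home zones."""
    total_workers = sum(z["workers"] for z in zones)
    weighted = 0.0
    for z in zones:
        # Jobs reachable from this zone within the time threshold.
        reachable = sum(
            jobs for minutes, jobs in z["jobs_by_travel_time"]
            if minutes <= threshold_minutes
        )
        # Weight by the workers who live there, so dense residential
        # zones count for more than sparse ones.
        weighted += z["workers"] * reachable
    return weighted / total_workers

zones = [
    {"workers": 3000, "jobs_by_travel_time": [(20, 50000), (50, 120000)]},
    {"workers": 1000, "jobs_by_travel_time": [(40, 30000), (70, 200000)]},
]
print(worker_weighted_access(zones, 60))  # jobs reachable within one hour
```

Weighting by residence is what lets the study reflect where commuters actually live rather than treating every point in the metro area equally.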

In terms of raw numbers, New York was the victor in every time segment, with 1.2 million jobs reachable in one hour. In fact, there was an incredible amount of consistency among a core group of cities -- New York, Chicago, San Francisco, and Washington, D.C., all finished in the top five every time. That should come as little surprise, given that those four cities are among the densest and most transit-reliant in the country. Conversely, Birmingham, Ala., and Riverside, Calif., were in or near the bottom of the list every time.

Ranked by percentage of jobs accessible through transit, however, the list tells a different story. New York still does pretty well (ranking sixth), and San Francisco still makes an appearance at number three, but Salt Lake City takes first, with San Jose, Milwaukee, and Denver rounding out the top five. (Sorry, Riverside -- you're still last.)

Salt Lake's first-place finish is well deserved -- a one-hour commute on mass transit puts Salt Lakers in reach of an incredible 25.42 percent of the region's jobs. New York? Just below 15 percent.

2. It pays to think regionally.

Even more amazing than Salt Lake's ranking in the study, however, is the fact that those 129,000 transit-accessible jobs can be reached using just one agency: the Utah Transit Authority. In achieving this, Salt Lake was not alone. Of the five top-ranked urban areas in the one-hour category (ranked by percentage of accessible jobs), three have just one major transit agency.

It may seem counterintuitive, but it's a fact that speaks to the power of strong regional planning on issues of transportation, especially when addressing longer commutes, which are more likely to cross jurisdictional boundaries. Unlike Riverside, say, where the transit systems are separated by county lines, Salt Lake, Milwaukee, and Buffalo can better coordinate their bus schedules to optimize commutes. A unified transit system is not a panacea -- poor-performing Birmingham also has just one -- but it can go a long way toward improving regional job accessibility.

3. Population density doesn't play as big a role as you might think . . .

You would think that population density and job accessibility go hand-in-hand. In denser cities, not only would one expect a greater concentration of jobs, but also a more substantial mass-transit network to bring people to them. Yet, as the graph below shows, the statistical relationship between the two is limited at best. (The same is true if you measure job accessibility in raw numbers.)

About 8 percent of jobs are transit-accessible within an hour in the Los Angeles area, for example. That's roughly the same as greater Cleveland, which is only one-third as dense.

4. . . . but investment in heavy rail does.

Population density is, of course, just part of the equation. The kind of transportation used to commute also has a major impact on job accessibility. Looking at the raw number of accessible jobs, the urban areas that "outperform" their counterparts of similar densities tend to have established, extensive heavy-rail systems.

The Boston area and greater Kansas City have roughly the same population density, for example, but an hour on mass transit puts a commuter within reach of up to five times more jobs in the former than in the latter. The Washington, D.C., area has a population density below that of New Orleans, but its subway-centric transit network can reach ten times more jobs in one hour.

Interestingly, though, when looking at job accessibility as a percentage rather than as a raw number, heavy rail is not always the reason for a city's high ranking. Heavy-rail-reliant San Francisco and New York still have an edge, but Milwaukee, Buffalo, Portland, and Denver, which lead their peers of comparable density, rank high despite having no such advantage.

5. People like to stick to their cars . . .

Perhaps the most interesting element of the study, however, is the relationship between the ideal situation it describes and Americans' actual commuting habits. In reality, most American cities have only a very meager number of mass-transit commuters, outside of a few large metropolises with well-established rail networks. This remains the case even when the city in question has a high proportion of transit-accessible jobs.

Despite Salt Lake City's progress in making jobs accessible via public transportation, for example, less than 5 percent of the region's commuters use mass transit to get to work. Same goes for San Jose, ranked number two percentage-wise.


6. . . . unless everybody is taking the train.

Looking at the raw numbers of transit-accessible jobs, however, the situation is quite different.

Here, there appears to be a strong correlation -- at least preliminarily -- between the total number of transit-accessible jobs in a given city and the percentage of that city's commuters who use mass transit to get to work. This is the case in both the 30-minute and 60-minute categories.

It will take a far more rigorous statistical analysis to determine if there truly is a relationship between the two measures. (For starters, the correlation is weaker when cities with very low transit usage are examined as a discrete group.) Still, it seems logical to assume that a virtuous circle exists between the sheer number of transit-accessible jobs and the proportion of people who use transit to travel to them.
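As a first pass at the kind of check the authors call for, one could compute a simple Pearson correlation between the two measures. The city figures below are invented for illustration, not drawn from the study:

```python
import math

# Illustrative only: (transit-accessible jobs in thousands, share of commuters
# using transit) for a handful of hypothetical urban areas.
jobs  = [1200, 600, 300, 150, 80]
share = [0.32, 0.17, 0.11, 0.05, 0.04]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

r = pearson(jobs, share)
print(round(r, 2))
```

A correlation alone cannot establish the virtuous circle described below, of course; it is only the preliminary signal that would justify the more rigorous analysis the piece mentions.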

When transit agencies focus on moving large numbers of passengers, the frequency of trains and buses usually increases. This, in turn, improves the public's perception of transit's reliability, which encourages greater ridership. Employers, eager to have access to a broader pool of talent, continue to locate their companies close to this well-used transit network.

7. If you build it, they still might not come.

Still, the report is perhaps most useful in showing that there's no magic formula for good transit investment. Salt Lake City is the perfect case in point. Thanks to regional planning that takes into account the entire Wasatch Front, the Utah Transit Authority has established a reputation as a national leader in smart mass-transit investment. Even so, the city's low rates of transit ridership show that more work is needed to effect lasting change in Salt Lakers' commuting habits.

New York City faces a different problem. Its residents don't need to be encouraged to use the subways and buses, but a profound lack of regional cooperation has for years harmed the city's potential for growth. New York's impressive statistic isn't that 1.2 million jobs are transit-accessible -- it's that the city managed to achieve that number despite having no meaningful coordination between New Jersey Transit, the Port Authority, and the various agencies of the MTA.

For some cities, success is driven by delegating transit-planning powers to a regional body that can predict the needs of the entire urban area and plan its investments accordingly. For others, it's the existence of a local transportation culture in which people consider mass transit the default option and their car the alternative, rather than vice versa.

The ideal, of course, is to find the place where those two trends intersect. But as the report shows, that ideal may be for now the most elusive destination of all.

Jacob Anbinder is a policy associate at the Century Foundation, the New York-based think tank, where he writes about transportation, infrastructure, and urban policy. All graphic data is from the American Community Survey 2013 1-Year Estimates (using "urbanized area" as the geographic category), and "Access Across America: Transit 2014," prepared by Andrew Owen and David Levinson for the Accessibility Observatory at the University of Minnesota.

Bringing Competition to Internet Service

Joshua Breitbart - October 15, 2014

If you live in an American city, chances are you're getting a raw deal -- paying more for broadband, and yet getting slower service, than your urban counterparts around the world. Part of the reason is that the urban broadband market in the U.S. is effectively a duopoly, as the chairman of the Federal Communications Commission (FCC) noted last month. Without competition, there's less incentive for Internet Service Providers (ISPs) to increase speeds, improve service, or slash prices. What's a city to do?

Bigger cities in the U.S. have relied almost exclusively on private companies to deliver broadband to residents, but the shine has come off this apple in recent years. Early on, the phone and cable companies leveraged their existing wires to squelch competition and dominate the broadband market. In the mid-2000s, cities like Philadelphia and San Francisco hoped companies like EarthLink or MetroFi would deliver citywide Wi-Fi to disrupt the ISP duopoly, but they didn't. Verizon and AT&T have rolled out some fiber-optic upgrades, but they have passed over many cities and neighborhoods. As a sign of desperation, one mayor threw himself in a freezing lake in a failed attempt to get Google to build a fiber-optic network in his town.

Taking the opposite approach, nearly 400 local governments have chosen to become public ISPs, according to the Institute for Local Self-Reliance. If the FCC strikes down a series of state bans on municipal broadband, hundreds more cities may pursue this model, but it is unlikely this solution will work for major metros where two companies have already built networks and acquired customers. The cities that have tried this route so far have generally been ones that the national ISPs have passed over; the largest cities with municipal networks are Chattanooga, Tenn., and Lafayette, La., with populations of roughly 170,000 and 125,000 respectively. Many smaller cities will not have the technical capacity or political will to take this leap.

Instead of getting caught between views of broadband as a wholly public utility or as a totally private amenity, big cities need to cultivate private-sector, non-profit, and cooperative broadband solutions neighborhood by neighborhood. The key is providing an "open access" network -- infrastructure that multiple service providers can use without each having to invest in their own citywide network. Cities can piece a network like this together the way they accumulate park land and affordable housing: through requirements on private developers and strategic use of public assets.

The citywide open-access network connects to the Internet backbone, then to key points in neighborhoods, like our libraries, firehouses, schools, and media centers. (Many cities already operate "institutional networks" that connect these community anchors, but they cannot use them to deliver Internet service per agreements with the cable companies.) The network doesn't offer Internet service, merely the opportunity to move data from one point in the city to another point at very low cost. Whether the data is heading to or coming from the Internet isn't the city's concern.

The price of Internet bandwidth varies widely across a city, as does the possible speed. Right now, it's only universities, major financial corporations, some hospitals, and Big Internet that get access to the speed and volume pricing of the backbone. It should be more like getting a street vendor license or a hack license, and open to that level of entrepreneurial effort. And those top speeds should not be available only in a central business district, but also in at least one spot in every neighborhood.

Cities can build these networks piece by piece, using "dig once" policies that coordinate infrastructure projects. If you are going to dig up the streets or lay new pipes for any purpose, the city should also install fiber-optic lines and conduits for future lines. As Columbia Telecommunications Corporation describes in its "Gigabit Communities" report, even if the local government doesn't make immediate use of these assets, it can potentially lease access to private providers, lowering a company's construction costs and minimizing disruptions for residents. Cities can be more aggressive in expanding the network with broadband-related requirements on new development, such as rights of way for rooftop wireless links or a mandated fiber-optic tie-in, as we might require a developer to connect to sewage and water systems.

Any efforts to streamline construction or add zoning requirements should not be at the expense of due process, however. Local policymakers -- even community boards and block captains -- need a basic literacy in broadband deployment to serve their appropriate function of oversight and public participation.

While cities can hope to connect every neighborhood, they need to target their effort where it is needed most or can do the most good. They can divide the city into more manageably sized markets for issuing franchises to access city light poles, streets, and sewers. Google Fiber divided Kansas City into "fiberhoods" where a critical mass of committed subscribers would determine if the company would build to that area. Only later did the company realize the level of outreach and organizing needed to promote broadband in chronically underserved areas. Cities can be more proactive, designating underserved areas as "broadband enterprise zones" where traditional economic development incentives such as tax breaks or loans help ISPs start or expand service. (My colleagues and I have proposed a methodology for identifying these zones and a model policy framework can be adapted from the Center for Social Inclusion's concept of an "Energy Investment District.") Low-income areas are already targeted for digital literacy programs, and occasionally for reduced service rates or other subsidies, but usually with the mistaken idea that the current service options are sufficient.

Even with no local government support, entrepreneurial providers like WasabiNet in St. Louis and BKFiber in Brooklyn are taking advantage of new wireless networking technologies to compete for customers in neighborhoods that have been chronically underserved. Community-based organizations like Red Hook Initiative in Brooklyn and Allied Media Projects in Detroit are also constructing neighborhood-scale wireless networks, using free software and teaching tools developed with the Open Technology Institute, where I work. Instead of public property, these organizers build on their relationships with residents, churches and other partners to install equipment.

Red Hook Initiative's project has boosted BKFiber by becoming a paying customer, raising the company's profile in the community, and helping it gain access to various rooftops to place equipment. Community-based, sometimes rather informal projects face considerable organizational and regulatory challenges, but they are increasingly within reach for neighborhood associations or cooperatives wishing to sponsor hotspots, develop a resilient emergency communication system, or share connections to the Internet. Cities that wanted to see more of these projects could fund them as education or job training and provide access to city property, so long as the process for doing so was transparent and continued to incorporate community participation.

Broadband is an essential service. Municipal governments have both a moral obligation and an economic motivation to connect all residents, but the ways for them to do this will vary from town to town. Not all governments will become Internet service providers, but all should take an active role in ensuring a vibrant and competitive broadband marketplace for their residents. None can rest while the current duopoly remains in place.

Joshua Breitbart is a senior research fellow with New America's Open Technology Institute.

Ryan Gabrielson, Ryann Grochowski Jones & Eric Sagara, ProPublica - October 14, 2014

Young black males in recent years were at a far greater risk of being shot dead by police than their white counterparts -- 21 times greater i, according to a ProPublica analysis of federally collected data on fatal police shootings.

The 1,217 deadly police shootings from 2010 to 2012 captured in the federal data show that blacks, age 15 to 19, were killed at a rate of 31.17 per million, while just 1.47 per million white males in that age range died at the hands of police.

One way of appreciating that stark disparity, ProPublica's analysis shows, is to calculate how many more whites would have had to be killed over those three years for them to have been at equal risk. The number is jarring -- 185, more than one per week.
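The arithmetic behind a figure like that can be sketched in a few lines. The two rates come from the article; the population of white males in that age range is an assumed round figure, not taken from the study, so this sketch only lands near the published 185 rather than exactly on it:

```python
# Back-of-envelope check of the equal-risk calculation. Rates are the
# article's published 2010-2012 figures; the population is an assumption.
black_rate = 31.17e-6   # deaths per person over the three years
white_rate = 1.47e-6

white_pop = 6.2e6       # assumed white males aged 15-19 (hypothetical)

actual_deaths   = white_rate * white_pop
at_equal_risk   = black_rate * white_pop
extra = at_equal_risk - actual_deaths
print(round(extra))
```

With this assumed population the gap comes out in the mid-180s over three years, i.e., on the order of the article's "more than one per week."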

ProPublica's risk analysis on young males killed by police certainly seems to support what has been an article of faith in the African American community for decades: Blacks are being killed at disturbing rates when set against the rest of the American population.

Our examination involved detailed accounts of more than 12,000 police homicides stretching from 1980 to 2012 contained in the FBI's Supplementary Homicide Report. The data, annually self-reported by hundreds of police departments across the country, confirms some assumptions, runs counter to others, and adds nuance to a wide range of questions about the use of deadly police force.

Colin Loftin, University at Albany professor and co-director of the Violence Research Group, said the FBI data is a minimum count of homicides by police, and that it is impossible to precisely measure what puts people at risk of homicide by police without more and better records. Still, what the data shows about the race of victims and officers, and the circumstances of killings, are "certainly relevant," Loftin said.

"No question, there are all kinds of racial disparities across our criminal justice system," he said. "This is one example."

The FBI's data has appeared in news accounts over the years, and surfaced again with the August killing of Michael Brown in Ferguson, Missouri. To a great degree, observers and experts lamented the limited nature of the FBI's reports. Their shortcomings are inarguable.

The data, for instance, is terribly incomplete. Vast numbers of the country's 17,000 police departments don't file fatal police shooting reports at all, and many have filed reports for some years but not others. Florida departments haven't filed reports since 1997 and New York City last reported in 2007. Information contained in the individual reports can also be flawed. Still, many of the reporting police departments are in larger cities, and at least 1,000 departments filed a report or reports over the 33 years.

There is, then, value in what the data can show while accepting, and accounting for, its limitations. Indeed, while the absolute numbers are problematic, a comparison between white and black victims shows important trends. Our analysis included dividing the number of people of each race killed by police by the number of people of that race living in the country at the time, to produce two different rates: the risk of getting killed by police if you are white and if you are black.
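The rate calculation described above can be sketched directly. The death counts and populations below are placeholders chosen so the output lands near the article's published rates; they are not the study's actual inputs:

```python
# Minimal sketch of the per-race rate calculation: deaths divided by the
# population of that group, scaled to deaths per million.
def rate_per_million(deaths, population):
    return deaths / population * 1e6

# Hypothetical counts, chosen only to illustrate the method.
black_rate = rate_per_million(50, 1.6e6)
white_rate = rate_per_million(9, 6.2e6)
risk_ratio = black_rate / white_rate
print(round(black_rate, 2), round(white_rate, 2), round(risk_ratio, 1))
```

Dividing one rate by the other yields the risk ratio -- the "21 times greater" figure in the article's framing.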

David Klinger, a University of Missouri-St. Louis professor and expert on police use of deadly force, said racial disparities in the data could result from "measurement error," meaning that the unreported killings could alter ProPublica's findings.

However, he said the disparity between black and white teenage boys is so wide, "I doubt the measurement error would account for that."

ProPublica spent weeks digging into the many rich categories of information the reports hold: the race of the officers involved; the circumstances cited for the use of deadly force; the age of those killed.

Who Gets Killed?

The finding that young black men are 21 times as likely as their white peers to be killed by police is drawn from reports filed for the years 2010 to 2012, the three most recent years for which FBI numbers are available.

The black boys killed can be disturbingly young. There were 41 teens 14 years or younger reported killed by police from 1980 to 2012 ii. 27 of them were black iii; 8 were white iv; 4 were Hispanic v and 1 was Asian vi.

That's not to say officers weren't killing white people. Indeed, some 44 percent of all those killed by police across the 33 years were white.

White or black, though, those slain by police tended to be roughly the same age. The average age of blacks killed by police was 30. The average age of whites was 35.

Who is killing all those black men and boys?

Mostly white officers. But in hundreds of instances, black officers, too. Black officers account for a little more than 10 percent of all fatal police shootings. Of those they killed, though, 78 percent were black.

White officers, given their great numbers in so many of the country's police departments, are well represented in all categories of police killings. White officers killed 91 percent of the whites who died at the hands of police. And they were responsible for 68 percent of the people of color killed. Those people of color represented 46 percent of all those killed by white officers.

What were the circumstances surrounding all these fatal encounters?

There were 151 instances in which police noted that teens they had shot dead had been fleeing or resisting arrest at the time of the encounter. 67 percent of those killed in such circumstances were black. That disparity was even starker in the last couple of years: of the 15 teens shot fleeing arrest from 2010 to 2012, 14 were black.

Did police always list the circumstances of the killings? No, actually, there were many deadly shootings in which the circumstances were listed as "undetermined." 77 percent of those killed in such instances were black.

Certainly, there were instances where police truly feared for their lives.

Notably, though, the data show that police reported that as the cause of their actions in far greater numbers after the 1985 Supreme Court decision (Tennessee v. Garner) that said police could only justify using deadly force if the suspect posed a threat to the officer or others. From 1980 to 1984, "officer under attack" was listed as the cause for 33 percent of the deadly shootings. Twenty years later, looking at data from 2005 to 2009, "officer under attack" was cited in 62 percent xxxvii of police killings.

Does the data include cases where police killed people with something other than a standard service handgun?

Yes, and the Los Angeles Police Department stood out in its use of shotguns. Most police killings involve officers firing handguns xl. But from 1980 to 2012, 714 involved the use of a shotgun xli. The Los Angeles Police Department has a special claim on that category. It accounted for 47 cases xlii in which an officer used a shotgun. The next highest total came from the Dallas Police Department: 14 xliii.

This piece originally appeared at ProPublica, a Pulitzer Prize-winning investigative newsroom. Sign up for their newsletter.


i ProPublica calculated a statistical figure called a risk ratio by dividing the rate of black homicide victims by the rate of white victims. This ratio, commonly used in epidemiology, gives an estimate for how much more at risk black teenagers were to be killed by police officers. Risk ratios can have varying levels of precision, depending on a variety of mathematical factors. In this case, because such shootings are rare from a statistical perspective, a 95 percent confidence interval indicates that black teenagers are at between 10 and 40 times greater risk of being killed by a police officer. The calculation used 2010-2012 population estimates from the U.S. Census Bureau's American Community Survey.
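The standard way to put a confidence interval on a risk ratio is on the log scale. The counts below are hypothetical stand-ins (the footnote does not publish its inputs), chosen so the resulting interval comes out similarly wide to the 10-to-40 range described:

```python
import math

# Sketch of a 95% confidence interval for a risk ratio, computed on the
# log scale. All four inputs are assumed, illustrative counts.
a, n1 = 50, 1_600_000   # deaths and population, group 1 (hypothetical)
b, n2 = 9,  6_200_000   # deaths and population, group 2 (hypothetical)

rr = (a / n1) / (b / n2)
# Standard error of ln(rr) for a cumulative-incidence ratio.
se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
lo = rr * math.exp(-1.96 * se)
hi = rr * math.exp(+1.96 * se)
print(round(lo, 1), round(rr, 1), round(hi, 1))
```

Because the smaller death count dominates the standard error, rare events like these produce very wide intervals -- which is exactly why the footnote reports a range as broad as 10 to 40 around a point estimate near 21.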







xl Calculated from the "Weapon Used by Offender" variable. Ranked based on frequency of reported shotgun homicides by police agencies.




Airbnb Regulated Into Legality

Ann C. Logue, R Street - October 10, 2014

One of the many oddities of San Francisco is that the city is full of libertarians who love regulation. You can do your own thing, unless you're a tech bro, a landlord or a big corporation, and then you must be legislated into submission. The city's housing market is distorted by a series of regulations that seemed like good ideas at the time, such as rent control and tight zoning. Instead of making the city more charming and affordable, they have created tension between those who can afford housing (the very rich, the long-tenured tenant) and those who can't. The result is a nasty edge to daily life in an otherwise gorgeous city.

The twin pressures of rent control and a booming economy have created occupations that can scarcely be imagined elsewhere, such as the master tenant: this is a person who has a large rent-controlled apartment and who makes a living by subletting rooms at market rate. Sure, the subtenants can complain, but they aren't likely to in a city where the shortage of housing is a serious issue.

Then there's Airbnb. The zoning and construction limits that affect the housing market also affect the hotel market. In 2007, Airbnb was formed in this world of semi-anarchy: a service that allowed people to rent out rooms to visitors. The host received more money per night than he or she would from taking on a roommate. The money offset the very high cost of living in SF, and the visitor saved money on hotel bills.

Win-win? Not quite. With no regulation, participating in Airbnb raised questions: could renters rent out space in their apartments without violating their leases? What if the renter moved in with her boyfriend but kept the rent-controlled lease to make a living as a full-time hotelier? Could landlords kick out tenants in order to rent out apartments to short-term guests? Would the hosts have recourse against crazy, violent or thieving guests -- or squatters? Likewise, would the guests be protected against difficult hosts? And was the city due taxes for the lodging services? If so, should it go after the hosts, the guests or Airbnb itself to collect?

Excessive regulation led to the creation of Airbnb, and less-excessive regulation may just save it. On Oct. 7, the San Francisco Board of Supervisors passed legislation allowing residents to rent out rooms if they register with the city and hold $500,000 in liability insurance. Also, Airbnb must remit lodging taxes to the city. Airbnb is now legal, and guests and hosts alike, at the very least, know where they stand relative to the law.

Regulation is such a complicated beast. It would be nice to say that there should be no regulation whatsoever, but let's face it: some people will behave badly unless they are given limits. On the other hand, too much regulation creates its own issues. Rent control is a bad idea; it is an economic transfer from the landlord to the long-term tenant with no social advantages, as the tenants receive the benefit without regard to need. As with any transfer payment, once it's in place, the beneficiaries form a tight constituency to keep it. No politician has the will to take on an issue like rent control, and there's no time machine to undo it.

On the other hand, there's the very interesting phenomenon of creativity acting in response to constraints. Because regulation creates problems, it creates demand for work-arounds to solve them. Airbnb is one example. Another, also from SF, is Uber: restrictions on the number of taxis meant that people who lived in San Francisco's neighborhoods could not get cabs. The taxi drivers would rather serve tourists than troll for passengers in the Fog Belt. The market for medallions may be limited, but other forms of on-demand transportation solved the problem.

Maybe that's the secret to economic growth in Northern California. We like to think that a high-tax, high-regulation jurisdiction would be a terrible place to do business, but people are flocking to San Francisco and surrounding cities in the hope of hitting it big. The tight regulations force creative thinking to work around them -- and maybe lead to their destruction.

This piece originally appeared on the R Street Institute's blog.
