
UCLA professors respond on housing

Continuing the debate: Does zoning drive up housing costs in cities?


Zelda Bronstein, in her December 13 article, calls us hypocrites. Specifically, she says that we said one thing in an article for Urban Studies, but then admitted the opposite was true in a working paper published by UC Berkeley’s Terner Center.

A building on Market Street that was presented as a way to address the housing crisis has instead become a corporate hotel.

Here is how Ms. Bronstein describes it:

 ...even as UCLA Professors Michael Manville, Michael Lens, and Paavo Monkkonen (MLM) were attacking Rodríguez-Pose and Storper (RS) and contending that “[v]iewed in full, the evidence suggests that increasing allowable housing densities is an important part of housing affordability in expensive regions,” they themselves had conceded that such evidence is partial, if not altogether lacking.

 …in February, a month after Urban Studies had accepted for publication MLM’s rejoinder to RS, the Terner Center for Housing Innovation at UC Berkeley published a paper by the UCLA trio, “Built-Out Cities? How California Cities Restrict Housing Production Through Prohibition and Process,” … In the Terner paper, MLM note that “one of the more common approaches to measuring regulation (used in over 20 of the 80 papers [we] reviewed) is to survey planning staff.” They then acknowledge that “[t]he accuracy of these survey-based measures remains an open question….In a study of entitlement processes, O’Neill et al. (2019) show that survey responses are often just wrong” (7).

“And yet, to discern how regulation does or does not influence housing supply, and thus price” (4), MLM draw on just such responses, without explaining how their source, the Terner California Residential Land Use Survey, got it right.


In brief, then, Ms. Bronstein claims we wrote one paper saying that the connection between zoning regulation and high housing costs was completely established and settled, but another saying there was actually no data to support that claim. We then, in that second paper, ignored our own admission and conducted an analysis with unreliable data.

If all that were true, it would indeed be bad behavior on our part. But almost none of it is true. Ms. Bronstein has misrepresented what we wrote in both articles. In both, we fully acknowledge the uncertainty and measurement difficulties that confront the study of land use regulation, but we go on to show why we believe, despite this uncertainty, that land use regulation plays a substantial role in high housing prices and rents, in California and elsewhere.

Let’s start with the Terner article. In that article we do note, as Ms. Bronstein suggests, that surveys of planning staff are a common way to measure regulation. But “common way” is not the same as “the only way.” Ms. Bronstein’s quote from us suggests as much. We reviewed 80 studies, and 20 of them relied on surveys of planning staff. That means 60 didn’t. So even if all surveys of planning staff are wrong, three-quarters of the empirical evidence base remains.

Moreover, as we also discuss in that paper, it isn’t as though these surveys are useless. Respondents are not leaving everything blank, or drawing in pictures of zoo animals instead of answering the questions. The surveys have errors and biases, and anyone using them needs to account for those problems. But the direction of the error matters. Is it overstating the impact of regulation, or understating it? Ms. Bronstein quotes us saying that a study by O’Neill et al. found many survey answers to be incorrect, but tellingly does not quote the very next sentence of our paper, which shows that planning staff may well be understating the degree of their city’s land use stringency, and thus the extent to which regulations impede housing:

They [O’Neill et al] review survey results from eight California cities about the average time to process and approve a development. They then compare these survey estimates with detailed case studies that estimate actual average processing times for different kinds of entitlements. In almost all cases, the responses drastically underestimate the processing times. For example, in Los Angeles and Pasadena responses to the survey reported projects consistent with zoning taking two to six months, whereas projects’ average time was almost a year in Los Angeles and a year and three months in Pasadena.

Ms. Bronstein next contends that, despite our acknowledging the unreliability of surveys of planning staff, we go on to use one in the analytical portion of our paper. That’s true, but it’s far from the whole truth. The main purpose of our paper was to introduce a different metric of regulation—data from Housing Elements—and show how it overcame many of the problems that afflict surveys of planning staff. We compare this new metric to measurements derived from the Terner Center survey. One can argue with our metric, of course, but one can’t say, as Ms. Bronstein does, that we rely exclusively on the Terner Survey. We don’t.

What about our Urban Studies article? To hear Ms. Bronstein tell it, in that article we concealed from readers all the problems that we knew existed with land use surveys, even as, in the Terner paper, we admitted that those problems existed. But that’s not right. The same points we made in the Terner paper appear in the Urban Studies paper as well. Here is what we wrote in Urban Studies (remember that “R-S” refers to Rodríguez-Pose and Storper):

“‘Regulation’, as it is used in the housing literature, can encompass a broad array of rules or tactics that impede development, from laws that prohibit apartments, to requirements that make apartment construction burdensome, to planners who slow-walk the permits for apartments that are allowed. Given the many options that localities have for inhibiting development, isolating the role of any given rule or tactic can be difficult, and even efforts to roll many regulations into an index may not be successful.

Particularly in the early years of the zoning/affordability literature, a lot of ink was spilled on these issues (Calfee et al., 2007; Cheshire and Sheppard, 2004; Gyourko et al., 2008; Quigley and Rosenthal, 2005; Schill, 2005). Many of these problems remain unresolved (Lewis and Marantz, 2019; Monkkonen and Manville, 2020), and it is entirely fair for R-S to remind everyone of that ambiguity.”

In short: regulation is hard to measure, people in our field have known as much for a long time, and no one should pretend otherwise. We cannot see how we might be hiding anything in this paper, or whitewashing problems with data. Indeed, if anything we go further in this paper than we do in the Terner paper, since here we suggest not just that surveys of planning staff, but all metrics of regulation, are imperfect.

One might wonder, given these imperfections, why we think regulations play a role in prices. It’s hard to fully answer that question in a short space. Certainly part of the answer is that we find alternative explanations for the housing crisis deeply unconvincing. We encourage readers to examine both our Urban Studies article and the R-S counterpoint, and decide for themselves. In the meantime, we can offer one more quote from our article, to explain how we can simultaneously acknowledge data problems and nevertheless feel confident in a conclusion:

…researchers have, in an attempt to resolve some of these measurement problems, used increasingly varied proxies for regulation. The study of land use regulation would be a dubious enterprise if everyone relied on one metric of regulatory stringency, found the results they wanted, and did their best to ignore that metric’s problems. But that is not the case.

Researchers have measured stringency by examining local zoning codes, surveying planners, surveying developers, looking at the cost and time to get building permits, measuring the difference between the average and marginal value of land, tracking the frequency of development litigation, tracking how often developers request discretionary approvals, measuring the quantity and intensity of opposition at planning hearings, and more. All these measures, when used in controlled statistical models, suggest that regulation suppresses housing production and increases prices (Albouy and Ehrlich, 2011; Einstein et al., 2017; Glaeser and Ward, 2008; Glaeser and Ward, 2005; Gyourko et al., 2008; Hilber and Vermeulen, 2016; Jackson, 2016; Kahn et al., 2010; Kok et al., 2014; Levine, 1999; Zabel and Dalton, 2011; Monkkonen et al., 2020).

In other words: no single approach is perfect, but people in our field, for precisely that reason, don’t use one single approach. A method of measuring regulation that overlooks one aspect of the problem can be complemented by a different measure that emphasizes that aspect. The strength of one metric can counterbalance the weakness of another. So it’s true that we can only learn so much from one study, or one method. But we can learn a lot by looking at what many different studies and methods find. If, when we surveyed all the relevant research, we found that different proxies for regulation yielded wildly contradictory results, that would give us pause, and make us think regulation played little role in housing prices. But that’s not what we find. Virtually every proxy for regulation points in the same direction: in high-demand places, regulation suppresses supply and drives up housing prices. We think that tells us something.

To some readers, it may still sound like we are jumping the gun. Perhaps we should only make judgments after we have perfect measurement and definitive proof. But that’s not how science works. Perfection is elusive, and there are few things we can know with 100 percent certainty. Waiting for perfection, moreover, has costs. The fact that something is hard to measure is not a good reason to treat its impact as zero. No one has ever proved, in a controlled experiment, that smoking causes lung cancer in humans, or that human activity causes global warming. But the rich and varied evidence we have in both cases—the many studies using different imperfect methods—adds up to a strong case for action, even if no single study can be called “proof.”

A final note: while we appreciate the opportunity to respond to Ms. Bronstein, this back-and-forth probably could have been avoided had she simply reached out to us before writing and publishing her piece. She did not, and by writing in a manner that suggests she did, she created the last misrepresentation we’ll mention. Her article strongly implies that she tried to contact one of us before publication, and that we were unreachable or declined to respond. That isn’t the case at all. We were only contacted—and not by Ms. Bronstein, but by her editor—after her article, and its attack, was online. We do wonder about this order of operations. The housing debate is an important one, and while many people have staked out sides and dug into their positions, we all have an interest in getting at the truth. That task will be easier when we approach each other in good faith.

