
When the Unelected Rule: Ten Case Studies in Regulatory Abuse


Introduction

By Congressman Joseph Knollenberg (R-MI)

Recently, the Office of Management and Budget candidly admitted that, in the areas of health, safety, and the environment, “it is difficult, if not impossible, to estimate the actual costs and benefits of federal regulations with accuracy. We lack good information about complex interactions between different regulations and the economy.” With federal agencies continuing to issue thousands of regulations each year, OMB’s acknowledgement raises serious questions about the purpose of Washington’s constant meddling.

Good government entails effective oversight. As a member of the House VA-HUD Appropriations Subcommittee that funds the Environmental Protection Agency (EPA), I have the duty of ensuring for the American people that EPA uses the taxpayers’ hard-earned money effectively, that EPA employs sound science, that EPA remains focused on authorized activities, and, ultimately, that EPA respects the Constitution.

This position affords me a first-hand opportunity to become familiar with many of the intricate workings of EPA. In general, I am concerned that, more often than not, the agency fails to rely on proven science in formulating its policies and rules. Too many times, administration policy and management failure get in the way of doing what is right. There are numerous examples — chloroform, MTBE, TMDL, NOx, and a slew of other acronyms that add up to a troubling picture.

TMDL (Total Maximum Daily Load) is a prime example of regulatory overreach, and I am pleased to see it addressed in this report. Cities, counties, and states overwhelmingly oppose the massive imposition of federal bureaucracy that these policies would entail. The cost of fully implementing the TMDL and NPDES (National Pollutant Discharge Elimination System) rules is in the billions of dollars. These rules will affect just about every American, from farmers trying to fertilize their fields, to mining operations, construction sites, power-production utilities, fish hatcheries, and timber operations — just to name a few. Yet despite the objections of a broad cross-section of public opinion and both houses of Congress, EPA has gone ahead and finalized this ill-conceived rule. In the rush to regulate, sound science is being left by the wayside.

Another major concern of mine is the Kyoto Protocol. Under the Kyoto Protocol, the U.S. is slated to reduce emissions of six major greenhouse gases to 7 percent below 1990 levels by 2008-2012. Before the global warming treaty was adopted, the Senate unanimously instructed the administration not to become a signatory to the treaty unless developing countries were required to reduce their greenhouse gas emissions within the same compliance period and no serious harm came to the U.S. economy. Yet developing countries are exempt from the Kyoto Protocol’s mandates, meaning that the American economy will suffer egregiously from the treaty’s restrictions on energy use. Nearly three years have passed since the Kyoto Protocol was adopted, and the administration has still not sent the treaty to the Senate for ratification as required by the Constitution. Indeed, they have no intention of ever sending it.

Notwithstanding the administration’s claim that it is not implementing the Kyoto Protocol, my office has collected dozens of examples that it is doing just that. I’ve pressed the inspectors general of several agencies to ensure that federal funds are not expended prior to Senate ratification. To ensure that agencies do not step over the line, I have had language barring them from doing so inserted into eight of this year’s 13 FY 2001 appropriations bills.

The Knollenberg provision, as it is known, is the result of a bipartisan effort to protect both the Constitution and the taxpayer by restricting any federal spending aimed at implementing the flawed Kyoto Protocol. Furthermore, the exact same language has been signed into law by President Clinton more than seven times since 1998. The language does not prevent scientific research into climate change nor does it restrict in any way the transfer of energy technology to developing countries, where emissions will grow the most in the coming years.

I will continue to fight to ensure that my language remains in each bill and I will continue to closely monitor EPA and other federal agencies to make sure they are not pursuing implementation prior to ratification.

That is why this annual report on the ten worst regulations is so important. The more the public knows about how their lives are adversely affected by the misuse of science, the barriers to technological innovation, and the lack of common sense that are found in so many federal regulations, the sooner people will demand a fundamental reform of this deeply flawed system.


Congressman Joseph Knollenberg (R-Michigan) is a member of the House Committees on Appropriations, Budget, and Standards of Official Conduct.

EPA’s Ill-Conceived Vehicle and Gasoline Standards Will Hurt Consumers and Air Quality

By Susan E. Dudley

In December 1999, the Environmental Protection Agency issued stringent new “Tier 2” vehicle emissions and gasoline regulations which appear certain to restrict the nation’s driving habits. EPA did this despite judicial rebukes and statutory constraints, and despite evidence from its own analysis that the regulatory changes will not significantly improve air quality or public health nationwide, and may actually cause air quality to deteriorate in some parts of the nation.

EPA’s Tier 2 rule: (1) sets stringent new emission standards for passenger cars and light trucks including mini-vans and SUVs, and (2) limits the amount of sulfur in gasoline.

  1. The vehicle standards limit emissions of oxides of nitrogen (NOx) from new vehicles to an average of 0.07 grams per mile (g/mi.), compared to 1999 vehicle emissions standards ranging from 0.30 to 1.53 g/mi. The standards also limit emissions of nonmethane hydrocarbons (NMHC), carbon monoxide (CO), and particulate matter (PM).
  2. Under the gasoline component of the rule, sulfur in gasoline must be reduced by an order of magnitude, from current average levels of 340 parts per million (ppm) for non-California gasoline to an average of 30 ppm.

Congress, through the Clean Air Act Amendments of 1990 (CAAA), directed EPA to consider tightening vehicle emission standards, no sooner than the 2004 model year, based on: (1) need (are reductions necessary to meet national ambient air quality standards?), (2) the availability of technology, and (3) cost-effectiveness.

However, despite pages and pages of supporting material, and numerous modeled scenarios, EPA has not justified its rule on any of the three criteria required by Congress.

EPA justified the “need” for tighter “Tier 2” emission limits by predicting widespread non-compliance with the stringent new national ambient air-quality standards for ozone (and particulate matter), which the agency issued amid much controversy in 1997. Those standards limited ambient ozone to 0.08 ppm. The nitrogen oxides and non-methane hydrocarbons emitted from vehicles can react in sunlight to form ozone under certain conditions.

As also required by the CAAA, EPA determined that new emission standards were both technologically feasible and cost-effective. Although the CAAA directed EPA to consider new standards for vehicles weighing up to 3,750 lbs., EPA’s proposal imposed a uniform standard on vehicles weighing up to 8,500 lbs. It also found that additional controls on the sulfur content of gasoline were necessary to achieve desired vehicle emission reductions.

On May 14, 1999, just one day after EPA proposed these vehicle and gasoline regulations, a three-judge panel of the U.S. Court of Appeals for the District of Columbia Circuit struck down the agency’s new ozone and particulate matter standards. In fact, the court said that, in setting the air quality standards, EPA had construed sections of the Clean Air Act “so loosely as to render them unconstitutional delegations of legislative power,” and had ignored offsetting health benefits of ozone in the atmosphere.

Rather than postpone consideration of these rules, however, EPA hurried to justify them on the basis of the pre-existing standards for these two pollutants instead of the overturned ones. This appeared to be a difficult task, since most of the country was well on its way to complying with the older standards. In fact, EPA’s own air-quality analysis, prepared for the Tier 2 rule before the court decision, revealed that—with the exception of California (which is exempt from the new rule) and a handful of localized areas around Houston and in the Northeast—the nation will be able to comply with the pre-existing ozone air quality standard without EPA’s draconian measures.

Undaunted, however, EPA produced new modeling statistics in June and October 1999, which contradicted its earlier analysis. The new analysis predicted many more non-attainment areas, and these results were offered as support for the agency’s objective of restricting vehicle emissions of oxides of nitrogen and non-methane hydrocarbons.

Meanwhile, EPA also appealed the appeals court’s decision to the Supreme Court, which, in May 2000, agreed to hear whether EPA’s reading of the Clean Air Act amounted to an “unconstitutional delegation of legislative power” when the agency crafted the regulations for ozone and particulate matter. The Supreme Court also agreed to consider an industry cross-petition on whether EPA should weigh non-health factors, such as the economic impact of regulations, in setting national air-quality standards.

One might argue that EPA’s manipulation of its statistical models to justify the rule is not necessarily bad. Don’t the ends justify the means, with the ends being substantially improved air quality? In fact, EPA admits that air quality will not improve significantly, and will actually worsen in some parts of the country. An EPA analysis shows that the regulations could actually increase seasonal ozone concentrations in some areas. For the nation as a whole, the agency’s statistical analyses show that the lower vehicle emissions would reduce ozone concentrations by only 0.0004 ppm. In layman’s terms, that’s roughly a 1.3 percent reduction. This is because, though vehicle emissions can react in sunlight to form ozone, they do not do so in a direct fashion, and the resulting ozone levels depend on various manmade and natural factors.

The areas of the country that would experience deteriorating air quality include parts of the Great Lakes region, parts of Texas, New Mexico, Arizona, Southern California, Utah, Washington, Colorado, Southern Florida, and even parts of the Northeast. The Western states will be hardest hit by the costs. EPA data reveal that people there will pay ten times the national average cost per pound of pollutant removed, yet will receive no benefits because they live in areas that already meet the current air-quality standard. According to the analysis EPA relied on before the court decision, the only places in the country that can’t meet the current air-quality standards are California — which would not be included under the Tier 2 regulations anyway — and a handful of localized areas around Houston and in the Northeast.

And of course, tightening the screws on SUVs and getting rid of the last vestiges of sulfur in gasoline doesn’t come cheap. Consumers will cover the costs — through rising vehicle and gas prices — to the tune of between $3.5 and $6 billion per year. By EPA’s understated estimates, consumers will pay hundreds of dollars more per vehicle. Petroleum refiners and car manufacturers suggest the costs will be much higher, and question whether meeting the standards is even feasible given the short lead time allowed by the rule.

The principal focus of the new rule is reducing ozone precursors; yet by EPA’s own estimates, the costs of the proposal far outweigh any benefits EPA attributes to improvements in ozone quality. EPA estimates annual costs of $3.5 billion, and annual benefits ranging from $3.2 billion to $19.5 billion. However, only 17 or 18 percent of EPA’s estimated benefits are due to reduced ozone concentrations. Rather, the quantified benefits of the proposal are dominated by reductions in particulate matter, even though gasoline-powered vehicle emissions, particularly NOx and NMHC emissions, have little effect on fine particulates.

The concern over episodic, localized ozone problems should be addressed not by EPA, but by the states or regional councils, such as the Ozone Transport Assessment Group (OTAG), which have been remarkably successful at designing innovative solutions to their own pollution problems. No matter what EPA claims, what may affect one area may be a non-issue in another.

Americans like clean air, and they’re already getting it. Even without EPA’s new initiatives, ozone concentrations have declined by at least 30 percent since 1978. Americans also like to drive their cars, whether it be to the grocery store, to take their kids to school, or to go on a family vacation to the country or the shore. Right now, Americans can still have both. But, with these new Tier 2 regulations, driving might become a luxury only the well-to-do can afford.


Susan E. Dudley is Senior Research Fellow and Deputy Director of the Regulatory Studies Program at the Mercatus Center at George Mason University. This summary is based on public interest comments submitted to EPA by the Regulatory Studies Program, and does not represent an official position of George Mason University.

TMDL: EPA Muddies the Nation’s Waters

By Bonner R. Cohen

If any single event in recent years can be said to embody the problems besetting federal regulatory policy, it is the manner in which the U.S. Environmental Protection Agency (EPA) set about to “revise, clarify, and strengthen” the nation’s approach to providing for cleaner rivers and streams.

In August 1999, EPA proposed sweeping changes to the Clean Water Act (CWA). EPA’s initiatives were contained in a proposed rule designed to dramatically alter existing practices for controlling levels of pollution in bodies of water throughout the country. The CWA program targeted by EPA is, like most Washington regulatory contrivances, known by its acronyms.

Created in 1972 when the CWA was enacted, the Total Maximum Daily Load, or TMDL, program is intended to ensure that the nation’s waters are of sufficient quality for the protection and propagation of fish, shellfish, and wildlife, and for recreation in and on U.S. waterways. TMDLs are used to restore water quality by identifying how much pollution a body of water can receive and still meet state standards. The amount of pollution entering the water is then reduced to that level.

What generated so much opposition to EPA’s move was the agency’s clear intent to centralize decision-making authority over TMDLs in its own hands. The Clean Water Act established a federal, state, and local partnership for stewardship of the nation’s waters, with states given primary and lead responsibility for implementation. EPA’s TMDL rule federalizes the program, expanding the agency’s regulatory reach and enabling it to intervene in decisions the CWA left to the states.

Among other things, EPA’s rule requires states to make comprehensive pollution surveys for individual bodies of water and determine pollution levels for each over the next 15 years. If a state does not abide by that 15-year deadline, or if the agency is not satisfied with the state’s calculations, EPA can step in and set the standards itself.

Furthermore, the rule allows for EPA to use subjective criteria in determining whether states are in compliance. And, according to state environmental officials, the rule does not give states enough time to compile adequate scientific data to support their decisions. While 15 years may appear ample time for states to carry out their TMDL responsibilities, a look at the task EPA is handing them presents another picture. States estimate that over 40,000 TMDLs will have to be established — an average of roughly one per state per week, non-stop, for the next 15 years.
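As a rough back-of-the-envelope check (assuming, purely for illustration, that the 40,000 TMDLs are spread more or less evenly across the 50 states), the workload works out as follows:

$$
\frac{40{,}000 \text{ TMDLs}}{50 \text{ states}} \approx 800 \text{ TMDLs per state},
\qquad
\frac{800}{15 \text{ years} \times 52 \text{ weeks/year}} \approx 1 \text{ TMDL per state per week}.
$$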

What this means in practical terms was underscored in a July 6, 2000 letter from the National Governors’ Association to President Clinton. “Given the costs of collecting data in each waterbody, calculating the contribution from each discharger for each pollutant, and devising methods for reducing each contributor’s share, it becomes clear that the states simply do not have the enormous resources necessary to accomplish such a task,” the governors pointed out.

Indeed, lack of reliable data on the condition of the nation’s bodies of water adds to the burdens EPA is placing on state governments and the regulated community. The General Accounting Office (GAO) reported in June 2000 that “the key water quality data available to EPA to identify the number of waters not meeting standards and the number of TMDLs that will be needed are incomplete, inconsistently collected by states, and sometimes based on outdated and unconfirmed sources.” As of 1996, the latest national data available, states had assessed only 6 percent of ocean shoreline; 19 percent of rivers and streams; 40 percent of lakes, ponds, and reservoirs; and 72 percent of estuaries.

In the same report in which it cited the inadequate data on which EPA was basing its rule, the GAO also took issue with EPA’s estimates of what the TMDL revisions will cost. The GAO determined that EPA’s calculation that the cost of compliance would be under $100 million a year was flawed, and that the cost would likely be well above that figure. Representatives of state environmental agencies have testified before Congress that actual costs to states preparing TMDLs will be between $1 billion and $2 billion annually.

Responding to a barrage of criticism from Congress, governors, state environmental officials, as well as business and agricultural groups, EPA, in the weeks and days preceding promulgation of the rule, frantically rewrote whole sections of its proposal. It did so, however, without allowing an opportunity for public comment on the changes it had made. In a May 31, 2000 report, the Congressional Research Service concluded that EPA had done little to respond to the concerns raised by stakeholders. Exasperated, the Association of State and Interstate Water Pollution Control Administrators concluded in a June 29, 2000 letter to EPA Administrator Carol Browner “that this set of rules is technically, scientifically, and fiscally unworkable.”

The chaotic circumstances under which EPA pushed through the rule were captured in a statement issued by Rep. Sherwood Boehlert (R-NY), chairman of the Water Resources and Environment Subcommittee of the House Transportation and Infrastructure Committee. “On May 24, 2000, I asked Administrator Browner to withdraw EPA’s TMDL proposal because of the overwhelming opposition to these proposals by stakeholders on all sides of the issue and because EPA could not explain how the proposed changes would be implemented,” Boehlert said. “Unfortunately, Administrator Browner has not responded. Instead, over the past month, senior EPA officials have been calling members of Congress, calling interest groups, making conflicting promises, and negotiating changes with select stakeholders in a last-ditch effort to drum up support for these flawed proposals. This is not the type of open, public process one should be able to expect when important federal regulations are under development.”

Convinced EPA’s rule would wreak havoc in communities throughout the country, Congress, with broad bipartisan support, attached riders to appropriations bills barring EPA from spending any money on implementing its TMDL rule in FY 2000 and FY 2001. However, Administrator Browner signed the final rule one day before President Clinton put his signature on the appropriations measures.

“EPA is taking this action in the face of overwhelming opposition from the National Governors’ Association, small businesses, farmers, and other landowners across America, and in direct defiance of a directive by Congress to forego finalizing or implementing these new rules this year or next,” commented Rep. Bud Shuster (R–Pennsylvania), chairman of the House Committee on Transportation and Infrastructure.

“All of this is nothing but a political power grab by the people running the EPA,” complained Rep. Marion Berry (D-Arkansas). “They have no scientific reason for doing any of these things.”

By rushing the TMDL rule to finalization before crucial questions about its content and implications could be answered, EPA opened the door to litigation that will last for years. The resulting uncertainty means that state officials and the regulated community will not know what steps they must take until the courts, a new Congress, or a new administration can resolve the issue.


Bonner R. Cohen is a senior fellow at the Lexington Institute.

Environmental Justice: The False Promise of Title VI

By Christopher H. Foreman

The last decade witnessed the emergence of a vigorous national movement for “environmental justice.” An improvisational aggregation of groups harboring many specific aspirations and grievances, the movement reflects an intersection of civil rights and environmental consciousness that few foresaw when the first Earth Day unfolded 30 years ago. Back then some African-American leaders were notably skeptical of the emerging environmental awareness as a distraction from more important business: uplifting the poor, then as now disproportionately black and ghetto-bound. But these days environmental themes figure prominently in social justice activism.

But exactly how should we foster greater environmental equity for communities of color? Grassroots activists can agree on this much: powerful institutions (i.e. business and government) must take seriously all community complaints and anxieties, especially regarding facility siting and environmental health. Prodded by environmental justice enthusiasts, authorities have sought myriad ways to respond. Desperate for a federal statutory hook, the Environmental Protection Agency has rolled its dice in one of the few games in town: Title VI of the 1964 Civil Rights Act. But the game to date is a disappointment, a state of affairs unlikely to improve despite EPA’s best intentions and the status of Title VI as a near-sacred civil rights text.

Mercifully brief, Title VI commands that “no person in the United States shall, on the ground of race, color, or national origin, . . . be subjected to discrimination under any program or activity receiving Federal financial assistance.” In short, you can’t discriminate on the government’s nickel. Otherwise the nickel may vanish. It sounds simple, but when applied to matters environmental it can lead to vast difficulties.

Now consider the challenge of this approach in the context of contemporary environmental policy. One must show that discrimination on the basis of “race, color, or national origin” has occurred. Environmental justice activists have rarely, if ever, been able to demonstrate in recent years the kind of overtly racist practice that flourished with abandon during the period leading up to the enactment of Title VI. This is unsurprising, since such practices have drastically declined under the relentless onslaught of law, litigation, administrative policing and public opinion. Even where racism continues to take a toll, as it arguably does in the realms of housing and criminal justice, the weight of official policy is usually against it.

Which is not to claim that fairness inevitably prevails in environmental decisions. Monied interests have the same advantages there, especially in access and expertise, that they wield throughout the political system. And an environmental agency cannot be above the law. Hence EPA must adhere to Title VI and avoid dispensing federal funds in a discriminatory fashion. How, though, should EPA accomplish this?

The agency provided a tentative answer in February 1998, when it issued “interim guidance” on the application of Title VI to environmental justice. The guidance was instantly controversial as it, somewhat paradoxically, raised a host of uncertainties in the minds of many state officials and business representatives. The Environmental Council of the States, representing the state environmental commissioners, soon officially opposed it as “unworkable” and lacking in “definitions, standards and methodologies that are precise or based on sound, peer-reviewed science.” A revised version, issued for public comment in June 2000 (after a process notably contentious even by the standards of contemporary environmental politics) aims for much greater specificity. It is a safe bet that neither community groups nor state officials will wax enthusiastic.

There are numerous problems here, beyond the two fundamental realities that environmental decisions are best construed in terms of trade-offs rather than “rights” and that low-income people will always have fewer choices than wealthier ones about where to live and what lies nearby. Activists want to protect community health, and the new guidance promises that EPA will assess the cumulative impacts on communities from new permits when Title VI is invoked in complaints to the agency. But EPA will doubtless remain unable to slay (or even effectively stalk) the beast of cumulative risk for a long time to come. Moreover, there is little or nothing explicitly about health in the new guidance, as attorney-activist Jerome Balter, longtime advocate for the environmentally beleaguered city of Chester, Pennsylvania, eagerly pointed out in a prepared statement to EPA just days after the document hit the Internet.

One potentially ironic effect of the policy could be to further complicate the already daunting challenge facing economic development in low-income, job-hungry places like Chester. At present we have more anxiety than evidence on this point, but the fear cannot be dismissed out of hand. By adding uncertainty and delay to permit approvals, the guidance could make some locales marginally less attractive to risk-averse firms. At the very least, how the policy squares with brownfield redevelopment and the administration’s New Markets initiative (intended to bring jobs to depressed areas) bears careful consideration.

From an activist perspective there are yet other holes in Title VI. EPA can influence permits directly but the siting of facilities much less so. The latter is a state and local matter, and the new guidance is explicit about the distinction. And as legal scholar James H. Colopy observes: “Title VI prohibits only projects with unjustified disparate impacts, rather than all projects that simply have a differential impact upon one sector of a community.” (Here we catch those naughty little trade-offs intruding once again.) But as even casual observers of the environmental justice scene know well, what motivates community protest is not an unjustified addition to an existing pollution burden but any addition whatsoever. Yet having examined more than 80 Title VI complaints from communities around the country since 1994, EPA has found none it deems a violation. With enough bites at the apple, though, EPA may eventually find a worm, and the ultimate remedy would be withdrawal of federal funds from the offending entity. That outcome will surely prompt congressional intervention (assuming federal agencies have the gumption to try it), unless the funding in question is so modest that the offending party is simply indifferent to losing it.

Activists can still reliably use Title VI to help delay siting proposals long enough to get sponsoring firms to throw in the towel. Witness the Shintech vinyl chloride plant planned for St. James Parish, or the uranium enrichment facility slated for Claiborne Parish. Both projects in Louisiana evaporated recently when exhausted sponsors pulled the plug. Community groups will continue to be adept at this game, whether discrimination can be plausibly demonstrated or not.

Anyone yearning for a path to significantly reformed environmental policy (regardless of political perspective) must look far beyond the dead end of civil rights law.


Christopher H. Foreman Jr. is Senior Fellow with the Governmental Studies Program, The Brookings Institution.

 


Bilingual Education: Where’s the English?

By Robert Holland

When Congress enacted the Bilingual Education Act in 1968, it is doubtful Members of Congress anticipated that 30 years later federal bureaucrats would be funding projects like the following in the name of bilingual education:

Developing educational software for students to use in developing written proficiency in Lakota (Sioux).

Lakota is an oral language; no written form exists. Why use federal education dollars to develop one? In a 1997 report, the federal administrator of a $240,000 grant to a South Dakota school district under Title VII (the Bilingual Education Act, now a section of the omnibus Elementary and Secondary Education Act, as revised in 1994) explained that “the Lakota language and Sioux culture are a part of our national heritage and programs such as this will ensure this language and culture will not be lost.” Meanwhile, another South Dakota district was using part of a $1.2 million, five-year Title VII grant for, redundantly, the very same kind of Lakota development.

Cultural preservation may be a noble cause, but what happened to teaching English? When Congress passed the 1968 legislation sponsored by Texas Senator Ralph Yarborough, the objective was to make Mexican and other immigrant children fully literate in English so they would not drop out of school in such appalling numbers.

SSOW (Summer School on Wheels) trip to the rain forests of Costa Rica to offer LEP (limited English proficient) students new experiences.

The report on this $144,000 Title VII project in the Rocky Boy School District in Montana noted that “students gained valuable insights into the rain forests, animals, volcanoes,” plus “9 of the 14 students received passing grades for the trip,” and “overall the trip was a huge success for the children and parents and chaperones alike.”

Okay, it was a cool field trip and a good time was had by all. (Well, almost all. How did five children manage to flunk a field trip?) But what did any of this have to do with teaching English to children who can speak little English?

In Miami/Dade County (Florida), development, creation, and dissemination of “Oli, Ole, Oli Ole” and “Bel Kont Bel Istwa,” two books and corresponding Teacher Manual consisting of poetry and folk stories in Haitian-Creole, with sample lesson plans and assessments.

This was the handiwork of the U.S. Department of Education’s Office of Bilingual Education and Minority Languages Affairs (OBEMLA), by means of a five-year, $2.6 million grant for Project BETTER (Bilingual Education Through Training, Enhancing, and Restructuring).

The BETTER report stated that the focus was on “development of literacy skills in the students’ home language.” It said nothing about making the children fluent in English, the supposed purpose of bilingual education. This comports with the priorities of OBEMLA in administering the so-called 75/25 rule. In 1994, with the Improving America’s Schools Act (the name given the ESEA reauthorization), Congress imprudently adopted a requirement that 75 percent of federal bilingual dollars go to support instruction in students’ non-English native languages, with “up to” 25 percent reserved for “alternative” programs that teach English in English.

As bad as that mandate was, OBEMLA has made it worse by interpreting “up to” as an excuse to be stingy with aid for English immersion. In practice, OBEMLA has designated far less than a fourth of grant money to English instruction. OBEMLA has become the coordinator of a bilingual cottage industry that has a vested interest in promulgating cultural and linguistic separatism as opposed to teaching immigrant children English so they can quickly enter the American mainstream.

To be sure, there are proponents of transitional bilingual education who sincerely believe that children can learn English more effectively if they first acquire fluency in their native language. But in practice that can become the equivalent of consigning them to a linguistic ghetto. Students often remain in these non-English programs for seven or eight years, or even longer. Younger children find it much easier to learn a second language and to do so with less likelihood of retaining a pronounced accent, but many students assigned early to bilingual education do not even begin instruction in written English until the fifth grade.

Dismal academic results are the bottom-line indicator of the failure of bilingual education. Miami/Dade County provides an instructive contrast between the locally developed English for Speakers of Other Languages (ESOL) program, and the federal Project BETTER, cited above. In ESOL, the district teaches English learners in English at least 60 percent of the time. A state assessment of writing skills showed that ESOL graduates actually scored higher than non-LEP students. In addition, dropout rates were down and graduation rates up among ESOL children. Meanwhile, Project BETTER, which focused almost 100 percent on teaching in Spanish and Haitian-Creole, showed paltry evidence of academic gains. Similarly, in a federally funded bilingual program run jointly by five rural school districts in north-central Colorado, only 18 percent of pupils in grades 3 to 12 showed any gains at all.

Indeed, there are other federally funded “transitional” bilingual education programs from which no students graduate in a given year. Many schools abuse the rights of Hispanic and other parents by brushing aside their requests to move their children into English-speaking classrooms. They even fail to inform parents that their children have been assigned to bilingual classrooms in the first place.

Americans are finally rising up against bilingual regulation that hurts children and defies common sense. The message is even getting across in Washington, D.C., where last fall the House of Representatives passed unprecedented bilingual reforms structured to give parents maximum control. School districts would have to furnish parents vital information about the nature and success rates of bilingual education and obtain their informed consent before their children could be assigned to bilingual programs. But final passage may be imperiled by election-year battles in the Senate over reauthorization of the ESEA, of which the bilingual programs are a part.

In any event, grassroots Americans have been far ahead of their Representatives on the urgency of curbing this egregious regulatory overreach. The first major blow came in June 1998, with decisive passage (61 percent “yes”) of California’s Proposition 227, dubbed “English for the Children.” After two years under the state’s new law, requiring most instruction of English learners to be in English, the evidence shows children are benefiting. For instance, a San Jose Mercury News study showed that second-graders in mainstream classes had risen to the 35th percentile in Stanford-9 reading results, while peers remaining in bilingual classes averaged the 20th percentile. Furthermore, California sparked hope that citizens could reverse such harmful dogma from the federally stoked education industry, and that has ignited a prairie fire of citizen activism.

In Arizona, Rep. Matt Salmon (R-AZ), author of the U.S. House-passed “Parents Know Best” provision requiring informed consent for placement in bilingual programs, recently threw his weight behind an initiative, Prop. 203, modeled after California’s Prop. 227. In fact, the Arizona proposition would be even stronger because it would not permit districtwide waivers. Connecticut has passed major bilingual reforms, Chicago and Denver Public Schools have enacted a three-year limit on the time students are permitted to spend in bilingual programs, and Massachusetts is considering ending bilingual education altogether. Years ago, conservative Republicans dominated this issue, but lately Democrats have sponsored many state reforms.

The people led the way and now politicians of both parties are joining to reverse three decades of failed regulatory policy that hurt children and fostered separatism.


Robert Holland is a senior fellow at the Lexington Institute.

CAFE: Putting Highway Safety at Risk

By Bonner R. Cohen

One of the most popular expressions making the rounds the past few years is: “If it ain’t broke, don’t fix it.” Soon, we will have to update that to say: “If it is broke, don’t keep it.”

Few things could be more broken than the nation’s misguided program to put more fuel-efficient automobiles on our roads and highways. The program, known as CAFE, for Corporate Average Fuel Economy, has not only failed to achieve its goals, but has had the deadliest of unintended consequences. Thanks to CAFE, thousands of Americans have met premature death in automobile accidents they might have survived had it not been for this ill-conceived program.

How did this happen? Reeling from the shock of OPEC oil embargoes of the mid-1970s, Congress and the Ford administration teamed up to enact the Energy Policy and Conservation Act of 1975. Designed to help Americans reduce their appetite for foreign oil and promote energy conservation, the law established a new federal scheme for regulating the average fleet fuel economy of cars and light trucks sold in the US. CAFE standards have undergone some modification over the years; currently, they are 27.5 miles per gallon (mpg) for cars and 20.7 mpg for light trucks, a category that now includes pickups, minivans, and SUVs.

The only way auto makers could comply with the new federal mandate was to downsize their models. Out went the big, roomy “dream boats” much beloved by an older generation of American drivers, and in came hordes of less-flashy compacts and sub-compacts. Even today’s “full-sized” cars are noticeably smaller than their pre-CAFE counterparts. The new offerings also are lighter than older models, and therein lies the safety problem CAFE has created.

In any collision — whether with a wall, a tree, or another vehicle — the laws of physics come into play. Even with the dramatic advances in automotive technology in recent years, the smaller the vehicle, the more likely those inside will be harmed. The National Highway Traffic Safety Administration (NHTSA) estimates that the downsizing of cars from the mid-1970s to 1982 cost 2,000 lives and 20,000 serious injuries annually. A 1999 USA Today analysis of data from NHTSA and the Insurance Institute for Highway Safety came to a similar conclusion, finding that, since the law mandating CAFE went into effect, 46,000 people have died in crashes they would have survived in bigger, heavier cars. A related study of the impact of CAFE, carried out by Harvard University and the Brookings Institution, found a 14-27 percent increase in accident fatalities that could be directly attributed to CAFE-induced downsizing of automobiles.

Fatalities are not the only problem with CAFE. Meeting CAFE’s arbitrary fuel-efficiency standards is not something auto makers can do on their own. Because the standard measures sales-weighted fleet fuel economy, the result depends on what the consumer purchases. And American consumers are voting with their checkbooks for larger, safer vehicles which also offer comfort and convenience unavailable in smaller cars. SUVs, minivans, and pickup trucks now account for over 50 percent of U.S. sales. Indeed, the SUV picked up where the station wagon left off, and its growing popularity has not endeared it to those who would confine the public to smaller cars.

The attempt to force-feed Americans undersized vehicles they clearly don’t want ignores the dramatic strides in fuel economy found even among SUVs. According to the Environmental Protection Agency (EPA), a mid-size car manufactured in 1975 got an average of 13.6 mpg, while a mid-size SUV produced in 1998 averaged 20.8 mpg. In other words, today’s average SUV gets over 50 percent better gas mileage than the average mid-1970s car.

To its credit, Congress slapped a freeze on CAFE standards in 1995. But every year since then lawmakers have had to withstand efforts to lift the freeze and impose even stricter CAFE regulations. The most recent attempt was led by Rep. Sherwood Boehlert (R-New York), who in the spring of 2000 gathered 40 signatures on a “Dear Colleague” letter urging an end to the freeze on CAFE standards. In trying to justify tougher CAFE standards, Boehlert and his allies claimed, among other things, that the program “is critical in reducing US dependence on foreign oil” and “cutting air and carbon dioxide pollution.”

CAFE does nothing of the sort. Imports of foreign oil have actually risen from 35 percent of total U.S. supply to 50 percent since CAFE was imposed 25 years ago. And carbon dioxide is not considered a pollutant even by EPA, which regulates auto emissions. Revealingly, the letter, which was identical to one circulated in the Senate a year earlier, made no mention of highway safety. A look at the fatality statistics tells why. Fortunately, driver and passenger safety won out over automotive political correctness, and yet another attempt to tighten CAFE regulations was beaten back.

In fact, Congress went even further and in October 2000 included language in legislation funding the Transportation Department extending the freeze on CAFE standards to 2003. Lawmakers also instructed the National Academy of Sciences (NAS) to conduct a study to evaluate the effectiveness and impacts of CAFE standards. Among other things, the study will examine the impact of CAFE on motor vehicle safety; disparate impacts on the U.S. automotive sector; the effect on U.S. employment in the auto industry; and the effect of requiring CAFE standards on domestic and foreign fleets. The NAS study is to be completed no later than July 1, 2001.

Making mistakes is only human. After all, America was unprepared for the oil shocks of the 1970s, and lawmakers of that era were reacting to public pressure “to do something” about the “energy crisis.” We now know that there was no energy crisis, but rather a temporary shortage of fuel resulting from OPEC’s decision to reduce oil production. That situation was made worse when Congress decided to impose domestic price controls and rationing on crude oil and refined products in the misguided pursuit of price stability. But when the United States later lifted those price controls, and non-OPEC countries started pumping more oil, the “energy crisis” went away.

Unfortunately, CAFE stayed. And with each day the law remains on the books, more Americans pay for this folly with their lives.


Bonner R. Cohen is a senior fellow at the Lexington Institute.

Dial “0” for Outmoded

By Jim Lucier

Make way for the nation’s newest telecommunications and Internet infrastructure regulator. It operates in secret, holds billion-dollar deals hostage, is prone to sudden and unexplained reversals in policy, and is given to sweeping reinterpretations of its once-limited congressional mandate. To make matters worse, the economic costs and benefits of its actions are only incidentally considered. This agency is the Federal Bureau of Investigation (FBI). The FBI is the lead agency in the Department of Justice’s little-recognized quest to become a primary economic regulator of the information economy.

Today’s FBI has not come to terms with changing technology. At one point, the circuit-switched telephone networks on which it relied for surveillance were state-sanctioned monopolies, monolithic in character, and reliant on centralized switching equipment that processed analog signals using now-primitive methods of communication relay. Such networks were inherently insecure if one had access to the central switching station.

Today’s networks, however, are different. They are digital, packet-switched, and subject to unprecedented user control (such as the encryption of data) at the nodes. By comparison with previous networks, they are decentralized and individual in the sense that the modern Internet is really an interconnected “network of networks” linked only by a common computer language. Networks are no longer run by government monopolies but by service providers in fierce competition with each other. These service providers are also increasingly global in reach and ownership. Additionally, the services these companies offer are ubiquitous in a geographical sense, pervasive in that they touch almost every aspect of modern life, and even mobile so they travel with the user’s every move.

For years, a panicked FBI has been trying to turn the clock back to a time before its traditional surveillance techniques became wholly obsolete. The answer, in the FBI’s view, is to return, through regulation, to network design principles dating back to 1968, or at least to require that today’s networks operate like the leaky and insecure networks of ages past.

In 1968 Congress passed language within the Omnibus Crime Control and Safe Streets Act that distinguished between the levels of Fourth Amendment protection guaranteed for switching information — essentially the to-from information part of a call — and the actual content of the call. The switching information (who called whom) was to be provided to law enforcement on the basis of a shall-issue administrative subpoena so long as the information was merely relevant to an investigation. This amounts to a fishing expedition. Call content required the much higher standard of a probable-cause search warrant.

Twenty-six years later Congress revisited the issue, largely at the FBI’s insistence that digitalization of switched-circuit telephony threatened investigative efforts. In 1994, Congress passed the Communications Assistance for Law Enforcement Act (CALEA), which required telephone companies to engineer their networks in such a way as to provide certain capabilities and capacities for court-ordered surveillance.

“Capabilities” were defined as the extraction of the signaling information and the call content the FBI had been able to access in the past, under the applicable constitutional standard. “Capacities” were defined as the number of simultaneous wiretaps the FBI determined to be necessary in a report to Congress. Telecommunications companies were to be reimbursed, with up to $500 million in taxpayer funds, for the cost of providing these facilities.

Trouble started at once. Congress specifically said the FBI was to have no role in setting telecommunications design, technology requirements, or in dictating the precise method by which it was to get the limited surveillance data to which it was entitled. Ignoring Congress, the FBI immediately presented industry with a “punch-list” of demands that went far beyond anything envisioned by even the most expansive reading of the statute. The FBI sought the capability to track the locations of cellular telephone users in real time even if the phones were turned off, despite previous indications to Congress that such capabilities were not needed.

An impartial observer could see the CALEA standards-setting process as characterized by bad faith, extreme intransigence, and attempted intimidation. The FBI even went so far as to attempt to have the accreditation of the industry standards-setting body revoked when it resisted implementing the FBI’s extra-legal punch-list in full. Ultimately, the FBI resorted to litigation to obtain much in the way of enhanced surveillance capabilities that Congress did not grant. At the same time, the cost of implementing CALEA rose from the initial $500 million authorized in the statute to a figure that runs in the tens of billions of dollars, much of this borne by the cellular telecommunications industry.

Having begun with good intentions, including an attempt to make the standards-setting as public as possible, the CALEA statute has by now run aground on two rocky shoals. The first is technological reality. Enacted in 1994 just as the Internet boom was beginning, and using decades-old terminology that already seemed wildly out-of-place, the statute was obsolete from its inception. As a second problem, whether consistent with the intent of Congress or not, the act has allowed the FBI to assume, in practice, the de facto powers of an economic regulator, albeit one without the oversight or accountability any such regulator should have as a check on its actions. Privacy advocates would argue that not just the economics but the civil liberties implications of the FBI’s activities need oversight.

Most disturbing of all, the FBI has attempted to use CALEA as cover for venturing into other forms of regulation, often using laws even more obsolete than CALEA in ways Congress could not have possibly anticipated. For several years, the FBI attempted to leverage Cold War-era defense trade controls — meant to keep war material and technological secrets from passing into the hands of Communist adversaries — into domestic use controls on widely available, freely published encryption software, which is one of the most basic technologies needed for secure networked computing. Had the FBI been successful in this campaign, annual costs imposed on the Internet and information technology industry would have run in the billions to tens of billions of dollars, with the prospect of heavy regulation in a previously unregulated industry. Recently, the FBI has used provisions of the Defense Production Act to block a string of proposed telecommunications mergers, with the intent of using leverage gained in secret negotiations to transfer controls native to the highly regulated telecommunications world to the more lightly regulated Internet service provider world as well.

Additionally, the FBI is attempting to establish itself as the arbiter of technical standards for many multilateral governmental agencies and international non-governmental standards-setting bodies, regardless of the cost and security issues raised by the FBI’s weakening of network security.

There is also the question of CARNIVORE, an FBI-developed packet-sniffing program that scans reams of information, regardless of constitutional protections, on the asserted theory that it may legally do so without a warrant as long as it retains only the IP address information. The FBI argues that this information should have the same lower level of protection as the old “who called whom?” data — and it acts as if that assertion were already settled. (Critics, by contrast, argue that IP-address information is much more sensitive and individual-specific than phone numbers on an old-style public telephone system.)
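To make the dispute concrete, here is a minimal sketch of what an “addresses-only” capture looks like. It is a hypothetical illustration in Python, not Carnivore itself or any FBI code: it reads raw packets, logs only the source and destination IP addresses, and throws the payload away. Whether even that metadata deserves no more than pen-register-level protection is precisely the question critics raise.

```python
# Hypothetical illustration only: a minimal "addresses-only" packet logger.
# This is NOT Carnivore or any FBI code; it simply shows what it means to
# capture traffic while retaining nothing but the IP header's to/from fields.
# Requires a Linux host and root privileges (AF_PACKET raw sockets).

import socket
import struct

ETH_P_IP = 0x0800  # capture only IPv4 frames


def addresses_only_sniffer(max_packets: int = 10) -> None:
    sock = socket.socket(socket.AF_PACKET, socket.SOCK_RAW, socket.htons(ETH_P_IP))
    try:
        for _ in range(max_packets):
            frame, _ = sock.recvfrom(65535)
            ip_header = frame[14:34]  # skip 14-byte Ethernet header, read 20-byte IP header
            fields = struct.unpack("!BBHHHBBH4s4s", ip_header)
            src = socket.inet_ntoa(fields[8])  # bytes 12-15 of the IP header
            dst = socket.inet_ntoa(fields[9])  # bytes 16-19 of the IP header
            # Everything past the IP header (the TCP/UDP payload, i.e. the
            # "content" of the communication) is deliberately discarded;
            # only the routing metadata is logged.
            print(f"{src} -> {dst}")
    finally:
        sock.close()


if __name__ == "__main__":
    addresses_only_sniffer()
```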

Regulatory scholars who have had the benefit of the last quarter century to study the excesses of the EPA and OSHA must now urgently turn their attention to the FBI. It is an agency the public hardly associates with economic regulation at all, yet it makes decisions with enormous cost implications for technology providers and equally grave civil liberties implications for the society that relies on them.


James Lucier is a Washington, D.C.-based securities analyst following Internet and e-commerce regulatory trends. He was previously Director of Economic Research at Americans for Tax Reform. Lucier has been active in matters of tax, trade, and technology policy for over 16 years.

But Is the FDA Safe and Efficacious?

By Merrill Matthews Jr., Ph.D.

Twice a month I take my seat beside physicians and researchers, along with a few nurses, statisticians and attorneys, in a large conference room at one of the country’s top medical schools.

It’s the regular meeting of the Institutional Review Board (IRB) for Human Experimentation, and the IRB’s job, required by federal law, is to review all of the human research experiments being undertaken at the medical school.

I am there as an ethicist and patient advocate. Although many of the board members are paid staff of the medical school, for most of us the time we give to the IRB is voluntary. I suppose that’s a good thing. If IRBs paid their members for their time, research on new drugs and medical devices would be even more expensive than it is currently.

Conceiving and creating a new drug or medical device is only the beginning. Then comes the testing process that will take years of human trials to see whether the drug is safe and efficacious. According to Dixie Farley of the federal Food and Drug Administration (FDA):

“[T]he FDA’s decision whether to approve a new drug for marketing boils down to two questions: (1) Do the results of well-controlled studies provide substantial evidence of effectiveness?; and (2) Do the results show the product is safe under the conditions of use in the proposed labeling? Safe, in this context, means that the benefits of the drug appear to outweigh its risks.”

Of course, weighing the risks of a drug against its benefits can be very subjective, since people differ in their willingness to take on risk. People also differ in their chemical makeup. A drug that is very effective for one person may not work for the next four and may make the fifth ill.

As a result of this FDA-required process, patients who could benefit from a new drug may wait for years — too long for some — before it becomes available. And when it finally does reach the market, they will pay a lot more for it than they should — and for many, a lot more than they can afford.

It wasn’t always that way. Under the Food and Drug Act of 1906, the federal government only prohibited interstate commerce in adulterated or misbranded drugs, foods, and drinks. The law was an attempt to protect patients from blatant fraud, not from themselves (that is, from an informed decision to take a medication). That second kind of protection didn’t come about until the Food, Drug and Cosmetic Act of 1938 (FDCA), which Congress passed after 107 people died from taking a misformulated drug.

The FDA, exerting its regulatory muscle, used the FDCA to prohibit the purchase of some drugs without a doctor’s prescription. Even so, as with the 1906 legislation, the primary intent of the FDCA was only to protect the safety of patients. All that changed in 1962 with the thalidomide crisis.

Thalidomide led to birth defects in thousands of babies, mostly born in Western Europe. And even though the FDA had never approved the drug, there were some U.S. victims, since the manufacturer had distributed 2.5 million sample packages to American physicians. Perceiving public support for stronger drug regulation, Congress passed the Kefauver-Harris Drug Amendments in 1962 to ensure both the safety and efficacy of new drugs. Thus, drug companies not only had to prove that new drugs were safe, they had to prove they worked.

Kefauver-Harris may have been the most costly piece of regulatory legislation ever passed. Currently, moving a new drug from inception through the approval process takes eight to 10 years and costs $500 million to $600 million. If safety were the only thing the FDA monitored, getting a new drug to patients might cost only $50 million and take perhaps one or two years.

Imagine how much less expensive prescription drugs would be if the approval process cost a tenth of what it does now. And the shorter approval time would give drug companies perhaps six to eight more years under their patents, allowing them to recoup their research costs over a longer period.

Would eliminating the need to test new drugs for efficacy put patients at risk? Not necessarily. The current lengthy approval process guarantees neither safety nor effectiveness. There are drugs that pass the FDA approval process that must be recalled because of adverse reactions. For example, both the anti-diabetes drug Rezulin and the antibiotic Trovan were FDA approved but were pulled after widespread use resulted in liver toxicity in some patients.

Moreover, the FDA’s increasing insistence that drugs must undergo a randomized, double-blind (i.e., neither the patient nor physician knows who is getting the drug), placebo-controlled clinical trial actually puts patients at risk. The safest clinical trial would compare the new drug to the standard therapy, if there is one. If the new drug produces better outcomes than the standard therapy with acceptable side effects, then it should be approved. However, the FDA is demanding that the comparison be done against a placebo, even when there is a possibility that some patients might be harmed by going off medication. Indeed, requiring a placebo has raised some real ethical questions among doctors about whether entering a trial would put some patients at unnecessary risk — especially in trials studying new drugs for such conditions as mental illness and AIDS.

To be sure, the FDA has sped up the approval process and implemented procedures to help get “rescue” drugs to sick patients quickly. But that still leaves the question of whether the FDA should be trying to determine effectiveness.

Were the FDA to drop its demand for efficacy, but require strict physician oversight and the informed consent of patients, the approval process would move more quickly. As a result, more patients would have greater access to more new drugs, and drugs would cost less because the approval process would be so much shorter.

Safety is an easier quality to judge than efficacy, just as it is often easier to determine if something is harming you than to know if it helps you. Determining the therapeutic effect of a drug, and whether the benefit is worth the risk, is something that ought to be decided by patients in consultation with their physicians. Keeping that power within the FDA costs us all more and may be harming our health, or that of those we love. That’s something for which we shouldn’t have to pay extra.


Merrill Matthews Jr., Ph.D., is a visiting scholar at the Institute for Policy Innovation.

The Futility of Internet Regulation

By Bartlett Cleland

The good news is that, so far, government has not seized total control of technology. A prevailing concern about the future of technology is how long it will be before Congress passes legislation that reduces the freedom to develop and use technology, and hence the incentives to innovate. Unfortunately, the Internet has been receiving a disproportionate amount of notice from regulators. What often drives government’s erroneous decisions is erroneous analysis of the proposed solutions: decision makers attack an Internet problem as they would have attacked an analog-world problem, without recognizing that the Internet operates differently.

Many online problems are genuinely new, or differ enough from their offline counterparts that proposed solutions have very different effects. A good example is the Internet gaming hysteria that currently seems to grip the Capitol.

Legislation has been introduced repeatedly to make Internet gambling illegal, despite the fact that forty-five countries now license and regulate Internet gaming. This fact alone makes it a safe bet that, given the borderless design of the Internet, no law will be able to effectively prohibit Americans from placing a bet online. That is to say, a U.S. law cannot effectively prohibit a foreign Web site from displaying any particular content.

The legislation does not make placing a bet a federal crime. So how does it attempt to enforce the ban? It would deputize online service providers (OSPs), the companies that give users access to the Internet. After notification by law enforcement, OSPs would be required to deny access to offending Web pages. At worst, this may force Web site operators to change Web addresses, a process that takes about 45 seconds. The real trouble is that the approach misunderstands the role of the OSP and treats Internet access as a government-granted privilege. The U.S. Congress wants to decide who can access the Web and under what conditions.

Regardless of the OSP approach, the Justice Department will not be able to reach sites operating in foreign countries, especially those operating with the blessing of foreign governments that license their activities. So prosecutors are left pursuing every individual gambler who may have bet lunch on the outcome of Monday Night Football. The bottom line: law enforcement will now chase bettors instead of drug dealers, gamblers instead of kidnappers.

Of course, we cannot, as a society, allow advances in technology to overrun our ability to govern. But ineffective legislation does nothing more than contribute to lawlessness. If citizens feel it is appropriate to break a law merely because it is unenforceable, we ultimately weaken the underpinnings of our society with respect to all crimes, regardless of seriousness.

Another example is the rush to force the use of Internet filtering devices. Internet filters are software programs designed to limit the content a person can view online. Some of the greatest demand for filters has been to exclude pornographic, violent, or other fringe content; some filters instead restrict viewing to a particular class of content, such as Christian Web sites.

There have been many attempts to require the use of Internet filters in schools, libraries, and other places of public Internet access. In each case, schools and libraries would be forced to install filters or face a dramatic cut in the money they receive from the federal government for their operations.

The problem is that Congress has viewed filters as an absolute solution to the perceived problem of children having access to dangerous information online in public places. Filters are not an absolute solution, and much has been made in the popular press of their shortcomings. That is because filters are a tool, not a solution; they are very useful tools for limiting Internet access. The problem with mandating them (putting aside all questions of federalism and unfunded mandates) is that the mandate displaces teacher and parental discretion and control.

In this case, Congress has not understood the technology or how it is really used. Teachers and librarians routinely report that they do not allow unrestrained access to the Internet in their schools or libraries. So while filters may be helpful in some cases, more often than not they represent an unnecessary expense or wasted software. As to the technology itself, Congress has again overestimated, or more precisely misjudged, the capabilities of the technology.

A simple explanation of how the Internet is constructed answers the question of why these misdirected government approaches will not work, despite good intentions. The Internet was constructed specifically to avoid central control. The predominant thought in the original design was to create a system that could withstand a nuclear attack. Said another way, the Internet was built to route around obstacles, no matter how big or small, and in these cases government is merely another obstacle. Given those starting points, certain design features were necessary and sensible.

The Internet has no central organizing point or system. It is not a thing you can touch; it is a function of the many computers connected to a system designed to support communication among them.

Just as important, the system has “smart ends” and a “dumb middle.” That is to say, the computers, and the people operating those computers, supply the only “intelligence” on the Internet. The middle, or the “pipes,” merely transports information from one computer to the next. The computers on the ends reassemble the information “packets” into a readable form. This is why online service providers, or OSPs, cannot and should not be held responsible in any way for the material sent via the Internet. Along the same lines, FedEx is not held responsible for the contents of the packages it transports.
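As a purely illustrative sketch of that division of labor (a toy model, not a description of any actual Internet protocol), the sender below splits a message into numbered packets, the “dumb middle” simply carries them along in whatever order, and only the receiving end is smart enough to put them back together:

    # Purely illustrative: a toy model of "smart ends" and a "dumb middle."
    # The sender numbers and splits the message, the middle moves opaque
    # chunks (possibly out of order), and only the receiving end reassembles.
    import random

    def split_into_packets(message, size=8):
        """Sender side: break the message into numbered packets."""
        return [(i, message[i:i + size]) for i in range(0, len(message), size)]

    def dumb_middle(packets):
        """The network: carries packets without inspecting them; order not guaranteed."""
        shuffled = list(packets)
        random.shuffle(shuffled)
        return shuffled

    def reassemble(packets):
        """Receiver side: the smart end puts the packets back in order."""
        return "".join(chunk for _, chunk in sorted(packets))

    original = "Content is only meaningful at the endpoints."
    assert reassemble(dumb_middle(split_into_packets(original))) == original

The point of the sketch is that nothing in the middle ever needs to understand, or even see, what the message says.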

So why is it that government regulatory approaches will not work? They target the dumb part of the system, the least responsive, and the least “impressionable.” What policy makers in general have yet to acknowledge is that the Internet is not a telephone system — no central control point exists. In fact, the very design of the Internet is to guarantee that no central control point exists. In this manner the power is at the ends of the system — the power is in the individual.

For many years, those who fought to preserve human rights or expand civil liberties argued that a central federal government was necessary to guarantee those liberties. Today, those same organizations stand opposed to government involvement online, arguing that Internet regulations (whether mandatory filtering devices or a ban on some Internet gaming) are the swiftest means to end free speech. In the same vein, they have argued for allowing robust encryption so that individuals can communicate without government interference.

The point is that even traditional defenders of big government have come to understand that the nature of the Internet has given power back to the very place that the Founding Fathers thought it most safely resided — the people. Whether lawmakers like this notion or accept this fact is largely irrelevant. A central government approach that assumes central control will never work, by design.

While many may oppose government regulations for a variety of reasons, a new reason should be added that reflects the difference of the online world: regulations aimed at the Web are sure to fail because, regardless of intention, they inherently cannot work.


Bartlett Cleland is the Director of the Center for Technology Freedom at the Institute for Policy Innovation (IPI). He was formerly Technology and Policy Counsel for Americans for Tax Reform, and earlier, counsel to U.S. Senator John Ashcroft. He may be reached at Bcleland@ipi.org.

Small-Business Employees Face Roadblock to the Investor Class

By Timothy Heitmann

At the dawn of a new century, we have witnessed a great, new economic phenomenon here in the United States — the rise of the investor class. Stock ownership has expanded from 15 percent of the American population in 1980 to a level of 50 percent today. While great progress in this area has been made over the past twenty years, one group has been left behind — the over 50 percent of all Americans employed by small business. Congress should move quickly to remove the federally enacted barriers that limit the ability of small businesses to offer their employees meaningful retirement programs.

Today’s “roadblock” originated in the late 1970s and early 1980s, when Congress began drafting legislation to address the nation’s underfunded retirement needs. Among other things, the legislation established the 401(k) retirement plan, which enabled workers to make pre-tax contributions into a tax-deferred account. Money that would normally be subject to income tax is instead invested directly in the worker’s retirement plan, where taxes on investment earnings are deferred until withdrawal at retirement.
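As a rough illustration of why deferral matters, the sketch below compares a pre-tax 401(k) contribution with the same money taxed up front and then invested in an ordinary taxable account; the contribution amount, return, tax rate, and time horizon are assumptions chosen only for the example, not figures from this article.

    # Illustrative assumptions only: $5,000 contributed, 7% annual return,
    # 28% tax rate, 25 years until retirement.
    CONTRIBUTION = 5000
    RETURN_RATE = 0.07
    TAX_RATE = 0.28
    YEARS = 25

    # 401(k): the full pre-tax amount compounds untaxed; tax is paid at withdrawal.
    deferred = CONTRIBUTION * (1 + RETURN_RATE) ** YEARS * (1 - TAX_RATE)

    # Taxable account: income tax is paid first, and earnings are taxed each year.
    after_tax_growth = 1 + RETURN_RATE * (1 - TAX_RATE)
    taxable = CONTRIBUTION * (1 - TAX_RATE) * after_tax_growth ** YEARS

    print("Tax-deferred 401(k): $%0.0f" % deferred)   # roughly $19,500
    print("Taxable account:     $%0.0f" % taxable)    # roughly $12,300

Under these assumed numbers the deferred account ends up roughly 60 percent larger, which is the advantage the legislation was designed to extend to workers.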

At the time, Congress wanted to give company owners and highly compensated employees an incentive to encourage 401(k) participation by less highly compensated employees. To that end, the legislation instituted a complicated testing procedure to detect inadequate participation by the less highly compensated, which in turn limited the amount the more highly compensated could defer. The goal was to make sure owners and management made every effort to encourage maximum participation by all eligible employees. Many believed that the testing procedure would force employers to match a portion of their employees’ contributions in order to ensure higher participation, which in turn would allow the more highly compensated to defer the maximum amount allowable under the law. This “intention” was eventually codified into law under the “Safe Harbor Act,” in which employers, in return for matching part of an employee’s contribution, were exempted from some of the more onerous aspects of the testing procedure.

From today’s perspective, one can conclude that 401(k)s have been an unqualified success. Whether the high levels of participation resulted from the testing requirements or simply from the excitement that comes with rising markets is subject to debate. The one unfortunate consequence of the testing requirement is cost. In most cases, the plan sponsor (the employer) must retain a record keeper and a third-party administrator to ensure the retirement plan complies with the testing requirements. The cost of these services typically ranges from $3,500 to $30,000 annually, depending on the size and complexity of the plan. That has had the effect of pricing 401(k)s out of the reach of many small companies. This is significant because 52 percent of all Americans work for companies with 50 or fewer employees, while 20 percent of all workers are employed by firms with 20 or fewer employees. Of course, new technology companies begin the same as other companies — small and lean. Many of these workers are essentially frozen out of the retirement savings system.

To address this problem, Congress passed legislation creating SIMPLE IRAs and 401(k)s for companies with 100 or fewer employees. While removing the testing burdens, the legislation required employers either to make non-elective contributions on behalf of each eligible employee or to match, dollar for dollar, a portion of each participating employee’s salary deferral. Anyone familiar with the vagaries of small business knows that in many cases these hard-working entrepreneurs are struggling just to make payroll and keep the lights on. For many, the mandated matching requirements of the SIMPLE IRA and 401(k) are risky burdens they cannot afford to take on. This does not reflect selfishness on the part of small business owners, but merely the reality that in many cases they must jealously guard their cash reserves simply to survive in business.

In the name of fairness, Congress has placed a burden on small business that not all can bear, and in so doing has denied many workers the chance to save for retirement. To bring the employees of America’s small businesses into the retirement savings mainstream, Congress should either exempt companies with 50 or fewer employees from the required employer matching contributions in the SIMPLE IRA and 401(k), or exempt these same companies from the 401(k) testing requirements. Employers could still contribute, but it would be voluntary. This would ensure that every small company in America could provide its employees with some sort of salary deferral program, enabling them to begin saving for their retirement. With the long-term survivability of Social Security very much in question, Congress needs to move quickly to ensure that this group of American workers has a chance to save for retirement in a manner similar to those who work for large companies. It would be both prudent and fair.


Timothy J. Heitmann is a financial advisor at First Union Securities, Inc., in its Washington, D.C. office (202-828-8118). First Union Securities, Inc. (FUSI) is a separate, non-bank affiliate of First Union Corporation.

The foregoing is for informational purposes only. It was prepared from sources believed to be reliable but is not guaranteed as to accuracy, and is not a complete summary or statement of all available data. The opinions expressed represent those of Tim Heitmann and do not necessarily represent those of First Union Securities, Inc.


The Children’s Online Privacy Protection Act

By Jim Harper

The road to hell, as they say, is paved with good intentions. If it needed fresh paving stones, the Children’s Online Privacy Protection Act (COPPA) surely has provided them. Passed hastily by Congress in 1998, the law took effect in April 2000. Its sponsors and proponents in Congress and the Federal Trade Commission (FTC) had good intentions, but more than good intentions is needed to justify new federal laws.

The result of their work was a set of regulations that do little to protect children, that mislead parents, that burden the new Internet economy, and that reduce the availability of interesting and educational content for children — especially children on the margins. Meanwhile, the most voracious consumers and abusers of personal information — governments — have been given a pass.

The Internet did not create new uses of information, but it accelerated and put a very public face on information practices that have been evolving for years. Businesses and marketers collect information about consumers and use it to tailor their products and advertisements. The result is better, cheaper goods and services for everyone.

Innovative uses of information improve people’s lives, but the idea grew in Washington that children should be protected — yes, especially the children — from losing their privacy. Children, after all, are vulnerable and less aware of the boundaries between private and non-private information. This pointed directly to regulating the Internet, and Congress followed along faithfully, passing the COPPA law after just one Senate hearing.

The COPPA regulation promulgated by the FTC requires commercial Web sites and online services that collect personal information from children under 13 to post privacy policies, notify parents of their information practices, obtain verifiable parental consent, and provide parents with access to their children’s information. Violators of the law are subject to FTC enforcement actions, including civil penalties of $11,000 per violation.

Congress was mistaken to assume that commercial Web sites pose a significant threat to children’s privacy on the Internet. Nothing spices up a congressional hearing like an innocent citizen who has been victimized, so lawmakers naturally would have brought children in to testify tearfully at the COPPA law’s one hearing. But no children could be found to tell about privacy invasions by commercial Web sites. Commercial sites actually pose little danger to children precisely because they are commercial. Businesses survive by making children and parents comfortable and safe, not by abusing them.

This puts commercial Web sites in the same category as parks, skateboards, the candy counter, and Saturday morning cartoons. Without parental guidance, they may be harmful. But, with COPPA, Congress singled out Internet businesses for special regulation.

More realistic dangers to children lurk in chatrooms and on Web sites operated by individuals and entities that do not cater to the public. But pedophiles and hate groups are not subject to COPPA. The law was like putting bright lights in a supermarket dairy section because of the dark parking lot outside. Moreover, COPPA only applies in the United States, though information travels freely across international borders.

So, with COPPA, Congress cleared the way for families to be victimized twice: once when the government fraudulently tells parents that the Internet is safe, and again when children fall prey to a malefactor on a non-commercial or non-U.S. site.

Just as importantly, COPPA increased the difficulty of serving educational and entertaining content to children. Anticipating the law, many sites that provided interesting content and broadening interaction stopped serving children altogether. NBCi, which owns and operates free e-mail services, canceled them for children under 13. America Online’s popular ICQ instant messaging service canceled the accounts of users whose birth dates placed them under 13. And the popular children’s television show, Thomas the Tank Engine, halted its regular e-mail bulletins, disappointing many thousands of young fans.

Compliance costs for COPPA have been estimated at $60,000 to $100,000. This has already dissuaded many small entrepreneurs from creating new and inventive ways of putting educational content online for children. Because the COPPA law began regulating an industry before it had come into existence, good ideas have been squelched.

Still other companies have instituted the procedures required by COPPA. Before collecting, using, or disclosing personal information from a child, a commercial Web site must obtain “verifiable parental consent” from the child’s parent. This means getting a signed form via postal mail or facsimile; accepting and verifying a credit card number in connection with a transaction; speaking to the parent through a toll-free telephone number staffed by trained personnel; or receiving consent via e-mail accompanied by a digital signature.

Thanks to COPPA, many children will be denied access. Which children are these likely to be? Those with absentee or busy working parents, poor children, and children whose parents don’t speak English. In other words, the COPPA law creates a digital divide between children whose parents can give “verifiable parental consent” and those whose parents cannot. COPPA denies access to the children who would benefit most from educational content on the Web.

Distressingly, the most voracious users of personal information — governments — are operating with almost no restrictions. Citizens must reveal tremendous amounts of personal information to comply with the tax laws and apply for licenses and benefits. Agencies and their employees have been known to search and sell such records with impunity. We hear routinely about government plans to snoop on e-mail, chatrooms, Web sites, and telephone calls. Congress failed to make itself, or any other part of the federal government, subject to COPPA.

The COPPA law was rushed through Congress without proper deliberation or consideration. Because of it, families will be lulled into thinking that the Internet has been made safe, fewer innovations tailored to serve children will materialize, and our nation’s at-risk youth will needlessly be denied access to the educational content and healthy interaction that can break the bonds of poverty and ignorance.

This certainly is the worst regulation, seen through the eyes of the children.


Jim Harper is the founder and principal of PolicyCounsel.Com, an Information Age public policy consulting firm. He previously served as counsel to committees in both the U.S. House and Senate.