Rural America Part 3: Development of American Agriculture Up to World War II

In the previous post in this series I presented two models of a farm economy that were widely deployed in the British colonies of North America. In this post we will go over how government action affected the use of both models from the first decades of our country’s independence until the New Deal era just prior to World War II. This review will provide us with some understanding of the problems faced by earlier generations of farmers. That background will make it easier to understand how the American agricultural economy has become a plaything of the wealthy in more recent times.

After our country gained independence, restrictions on westward expansion imposed by the British government were removed. Again, I won’t dwell on the shameful history of our treatment of Native Americans. Our focus here is on how government policies shaped which model of rural economy dominated as the country expanded. For more details on the changes in farm policy I will be summarizing here, check out this document.

At first, federal policy favored selling newly-seized federal lands in the west to private parties at high prices. Southern politicians favored one goal of this policy, namely the spread of the plantation model. And, in fact, in the south the plantation model continued to expand westward. Wealthy planters could afford to buy large tracts of land. It was cheaper for them to move their plantations westward than to try replenishing the soils they had exhausted by repeatedly planting the same one or two crops they had been growing on their existing plots. The invention of the cotton gin opened a new market for cotton, which happened to grow wonderfully across large sections of the south. This also encouraged the spread of the plantation model, since the intense labor needed to harvest cotton could be obtained cheaply by using slaves.

Over the course of the early 19th century federal policy was repeatedly modified to promote the spread of the yeoman farmer model. Finally, the Homestead Act of 1862, passed during the Civil War when the southern states were not participating in US federal policymaking, settled the issue decisively in favor of the yeoman farm model for newly-settled lands in the midwestern and western US. As a result, by the end of the 19th century the vast majority of agriculture in America was carried out by small farmers. It should be pointed out that this act, and a couple of follow-up acts in the first decade of the 1900s, led to many new farms on the Great Plains, many new farmers, and the loss of huge tracts of native prairie land to agriculture. This led to some unforeseen consequences in later decades.

The American economy as a whole was transformed by the Civil War. Manufacturing, finance, and transportation industries exploded in size and economic power. As one might expect from our discussion of the unique vulnerabilities of rural communities, small farms were targeted for exploitation by some of these large businesses. For example, the railroad and riverboat barons raised shipping fees to the point that farmers found it difficult to afford shipping their products to market via rail or boat. Farm communities pressured their federal representatives to give them relief from these monopolies. This is one reason the Sherman Anti-Trust Act of 1890 was passed by Congress.

It would be a mistake to think that these major changes eliminated the plantation model altogether. Instead, via a series of maneuvers powerful landowners in the former Confederate states were able to re-establish the plantation model. First, they were able to break down the Reconstruction Era promise made to former slaves that they would be able to establish small farms of their own. The famous “40 acres and a mule” policy was weakened and ignored to the extent that the majority of former slaves and their descendants who worked in agriculture in the southern states became either tenant farmers or farm laborers on land owned by the great plantation owners or their descendants.

Family farms did relatively well in the first two decades of the 20th century. As mentioned above, both the number of family farms and the average income of farm owners increased during these two decades, a result of several factors. Both federal and state governments designed policies to protect farmers from exploitation by monopolists, ease their access to loans, and help them obtain information on improved farming and marketing techniques. The rise of farm cooperatives and organizations helped farmers pool their power and obtain better prices for their products. Organizations such as the Grange had been lobbying governments at all levels on behalf of family farmers since the Grange’s founding after the Civil War, and the collective result of these efforts was also bearing fruit (sorry for the pun!) at this time.

When World War I broke out in 1914 European farm output dropped. Many American farmers, who were already doing relatively well, responded by quickly increasing production. During the war years American farmers prospered. The war ended in 1918, and by 1920 European agriculture was well on the way to recovery from the devastation of the war. As their production went up, the need for imports from America declined. American farmers who had rushed to increase production could not find another market for much of the crop they had produced. As a result, the price of agricultural products dropped. Farmers who had taken out loans to pay for lands they had bought to grow more crops or to buy seed and equipment were no longer able to make enough money to keep up with loan payments.

The federal government made some attempts to address farmers’ concerns in the 1920s but they didn’t lead to significant improvements for most farmers. Then twin disasters struck between 1929 and 1935. The Great Depression caused prices for agricultural products to drop precipitously, leaving many more farmers unable to make loan payments. Many farmers responded at first by trying to increase yields, but that just pushed prices down even further. Starting in 1931 a multi-year drought struck the Midwest. Crops failed. Many of the newer farm families were relatively inexperienced with drought conditions and had not developed strategies to deal with them. As a result, millions of acres of land were left bare. High winds carried off topsoil in dust storms. With their farms ruined, many farm families had little choice but to abandon their homes and move. All told, about 2.5 million people migrated out of the Great Plains states due to economic hardship during the 1930s.

The New Deal program initiated by the FDR administration included several measures specifically addressing agricultural issues. These measures fundamentally altered the economics of farming and established the legal framework for the recovery of American agriculture that began in the 1940s.

The primary piece of legislation included in FDR’s New Deal for agriculture was the Agricultural Adjustment Act of 1933. This law, with some follow-up legislation in 1934 and 1935, set up a subsidy system for production of major agricultural commodities: wheat, corn (maize), hogs, cotton, tobacco, rice, milk, rye, flax, barley, grain sorghum, cattle, peanuts, sugar beets, sugar cane, and potatoes. Farmers were encouraged to work with the newly-formed federal agency, the Agricultural Adjustment Administration, to arrive at a level of production that would enable the farmer to receive a price for their produce as close as possible to the target price fixed by the law. The law specified that the target price would be the price for the produce in the years 1910-1914, when prices for farm products were high enough for most farmers to do well financially. If the farmer was forced to sell her produce below the target price the federal government paid the farmer a subsidy to bring her total receipts up to what they would have been if she had been paid the target price.
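The mechanism can be sketched in a few lines of code. The numbers below are made-up round figures for illustration only, not actual 1930s or 1910–1914 prices: the subsidy simply tops the farmer’s receipts up to the target price.

```python
def parity_subsidy(target_price: float, market_price: float, bushels: float) -> float:
    """Payment that tops a farmer's receipts up to the target ("parity") price.

    If the market price meets or exceeds the target, no subsidy is due.
    """
    return max(0.0, target_price - market_price) * bushels

# Hypothetical round numbers (not historical figures): a wheat farmer sells
# 1,000 bushels at $0.50 per bushel against a $1.00 target price.
market_receipts = 0.50 * 1000               # $500 from the market
subsidy = parity_subsidy(1.00, 0.50, 1000)  # $500 from the government
total = market_receipts + subsidy           # $1,000, as if paid the target price
```

Note that the payment only kicks in when the market falls short; in a year when market prices met or beat the target, the farmer simply kept the market receipts.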

This law stabilized agricultural markets and saved many farmers from going bankrupt and many farms from foreclosure. The law was drafted largely based on the yeoman farm model, where the owner of the farm was also the person working the land. The federal agency worked directly with landowners, not farm laborers or tenant farmers. In a sharecropping situation, the federal government reimbursed the landowner if the price of the produce sold by tenant farmers using that land did not meet the target price. The law included provisions to ensure that for any subsidies awarded to landowners, appropriate percentages would be shared with tenant farmers or laborers who had worked the land. Southern lawmakers were able to weaken the subsidy sharing provisions and some agency employees overseeing the program in southern states purposely failed to enforce the provisions. As a result, tenant farmers in the south were largely driven out of business in the following decades.

The Supreme Court struck down the Agricultural Adjustment Act in 1936, but the FDR administration was able to pass modified legislation in 1938 that passed muster with the Supreme Court and continued the subsidies. Congress also included in this legislation provisions to help midwestern farmers recover from the Dust Bowl devastation and change farming practices to protect and preserve topsoil.

The sum total of the changes brought about by the New Deal lifted many American farmers teetering on the edge of bankruptcy and desperation into relative prosperity. In other cases, e.g. the “Okies” who migrated from the Midwest to California to escape the Dust Bowl or the southern tenant farmers we mentioned above, the New Deal programs were not enough to save people from impoverishment. On the whole, however, the use of the plantation model was greatly reduced. Most farms remained of moderate size and operated on the yeoman model.

After World War II this situation changed drastically. What happened to the farm economy then and what it did to rural communities will be the topic of our next post.

A Primer on the Unique Economic Vulnerabilities of Rural America


This fourth post in my “If You Can Keep It” series appears at the end of a week that included the first debate in the 2024 presidential election, a debate that has many of those 50 million Americans who watched it wishing that there were some other viable candidate besides the two that have been chosen by the two major parties. It also included a series of Supreme Court rulings that will require major changes in the way our government functions and could lead to our federal government changing from a democratic republic to a dictatorship.

Those of you who have been following this series so far may decide to spend your time focusing on these other issues for now. I understand that completely. It seems to me as well that a five-alarm fire has broken out in our society. But as scary as some recent events may be, there isn’t much most of us can do about them right now. And understanding the context and background events that led to the rise of Donald Trump and Joe Biden and the recent Supreme Court decisions can help you take a longer view and make better decisions about what you ask of your government representatives and who you vote for come November. The goal of this series is to provide that critical background. I hope you will stick around, and if not, that you will come back later in the year (preferably by September) to catch up on the posts you have missed and share them with others, especially people you know who may be undecided or disenchanted about voting this November.

In the introduction I indicated that one current, fundamental problem of American society is that the extremely wealthy have managed to reconfigure the economy and government policies to give them more freedom and take it away from everyone else. As Warren Buffett put it, “There’s class warfare, all right, but it’s my class, the rich class, that’s making war, and we’re winning.” One of the most obvious illustrations is the accelerating decline of rural American life. Most of the people who will read this series likely don’t know much about the vulnerabilities unique to rural and agricultural communities. Because it was so easy for corporate powers to exploit these vulnerabilities, it also makes it easier for me to illustrate trends that, with some variations, affect the rest of us.

My nuclear family has a personal stake in this issue. We have lived in or near farm communities for most of the last 30 years, including 18 years operating our own small horse farms. Reader, you have a stake in this issue as well, because, for better or worse, what happens in these communities affects what you eat every day, the kind of building in which you live, and how you get around.

One of the major virtues needed to thrive in a rural community is self-reliance. Since you have few others nearby you can rely on to supply your needs, you need to learn how to fend for yourself in many difficult circumstances. This principle broadens out as people in a rural area band together. A family living far away from others needs to be able to supply its own needs by relying only on those in the family. A small community group living far away from others needs to do the same. In fact, we are told that human beings lived in small, relatively isolated groups like this for most of human history.

Unfortunately, living with only a small, relatively-isolated group of human beings to rely on is a risky proposition. It’s not that others in the group are unreliable or malicious; rather, there are too many potential catastrophic events that a small, isolated group doesn’t have the resources to protect its members from. For that reason, most people living in rural areas have depended on resources provided by a larger society for at least some of their critical needs.

Relying on a larger society’s resources to support a rural lifestyle comes at a cost. Put yourself for a moment on a farm several miles outside a village in central Kansas, Nebraska, Iowa, or North or South Dakota. Since you are relatively far away from the larger society, making contact with it is expensive and time-consuming. Not many from a more urbanized environment will be willing to invest the time and money to bring the resources you need. Those who do will aim to be your sole provider. If they can pull that off, they can demand a higher price in exchange for the resources they provide. The technical term for this type of arrangement is a monopoly, or single-seller situation.

Conversely, in order to pay for the resources coming from the outside, you have to trade something you’ve gathered, grown or made. Those who are willing to spend the time and money to buy the items you have for trade will aim to be the sole purchaser. If they can pull that off, they can pay you a much lower price than you would be willing to accept if there were someone else you could sell it to instead. The technical term for this type of arrangement is a monopsony or single-buyer situation.
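The squeeze from both sides can be made concrete with some toy numbers. These figures are purely illustrative, not historical data: a farmer sells 1,000 bushels and pays freight to reach a distant market, first in a world with competing buyers and carriers, then facing a single buyer (monopsony) and a single carrier (monopoly).

```python
bushels = 1000

competitive_crop_price = 1.00  # per bushel, with several buyers bidding
monopsony_crop_price = 0.70    # the lone buyer can bid well below that

competitive_freight = 0.10     # per bushel, if carriers competed for the haul
monopoly_freight = 0.25        # the lone carrier charges what the traffic will bear

# Net income: crop receipts minus freight charges
income_fair = bushels * competitive_crop_price - bushels * competitive_freight
income_squeezed = bushels * monopsony_crop_price - bushels * monopoly_freight

print(income_fair)      # 900.0
print(income_squeezed)  # 450.0
```

Same farm, same crop, same harvest; the only change is how many buyers and carriers are competing for the farmer’s business, and half the income disappears.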

American government has always been aware of the potential for exploitation of rural citizens by outside entrepreneurs. For example, when the country was very young cross-country travel was extremely difficult. Not having the funds to build and maintain roads, governments relied on private investment for road-building. In order to recoup their investments, the companies that built these roads charged tolls. When the state of Pennsylvania contracted with a private company to build a turnpike between Philadelphia and Lancaster, PA, they set toll prices to prevent the company from charging excessive tolls, since the turnpike was by far the fastest and easiest way to get into the interior of the state.

The twin curses of monopoly and monopsony have frequently plagued the rural inhabitants of North America for hundreds of years. Here is a brief list of examples in roughly historical order:

Let’s pause at this point to reflect on the railroads. Railroads are a prime example of what is commonly called a natural monopoly. It costs a lot to build track and trains by design can only run on track. If you want trains running in both directions along the same route, you either have to build two tracks or include frequent and long sidings that allow one train to get out of the way temporarily for another train coming down the same track in the opposite direction. The engines, cars, and fuel needed to operate a train are all very expensive. For these reasons, it is impractical for several different outfits to provide similar train service to the same group of customers. Instead, a single railroad company typically built out all the rail lines that served a specific community or region, depending on the size of the railroad company.

In the late 19th century our federal and state governments by and large failed to follow the example of the state of Pennsylvania’s contract with the turnpike company by regulating the rates the railroads could charge their customers. Railroads were allowed to charge customers whatever they could get away with. Furthermore, it became customary for railroads to provide special rebates to certain customers. Most of us are used to this type of differential, “buy-in-bulk” pricing. In the late 19th century, however, the practice came with some shady extras. For example, a railroad company would make a private agreement with, say, John D. Rockefeller’s Standard Oil Company, to ship its oil at a huge discount. In exchange, Standard Oil would ship enough oil to occupy an entire train’s worth of freight and handle all the loading and unloading. This arrangement saved the railroad enough to offer the discount without cutting into its own profits. But Standard Oil added a couple of other provisos. The railroad agreed to either (a) not offer the same discounts to other oil producers or (b) not ship oil for other oil producers at all. (By the way, Standard Oil became a monopoly in the oil distribution industry and was broken up by the federal government in 1911 for violating the 1890 Sherman Anti-Trust Act.)

The Theodore Roosevelt administration brought the railroad monopolies to heel with the help of Congress when it passed the Hepburn Rate Bill in 1906. This bill outlawed the type of discriminatory rebates discussed above and empowered the Interstate Commerce Commission to set maximum shipping rates. President Roosevelt made several public speeches in advance of the votes on this bill, arguing that the government needed to step in to regulate the railroad monopolies because left on their own they would continue to reward large corporate customers with special deals and punish small businesses like farmers and consumers with higher prices simply because these customers had no other option. You can read more about Roosevelt’s reasoning in support of federal regulation of railroad shipping rates here.

In the next few posts we will examine how large corporate interests gained monopsony power over farmers and what has happened as a result. We will also explore how Walmart built its retail empire in part by impoverishing small towns. Finally, we will examine how government’s corporate-friendly approach to healthcare has led to the disappearance of healthcare providers in wide swaths of rural America.

Who Should be Free? Part 2: Employee Non-Compete and No-Poach Agreements As One Example of the Wealthy’s Ongoing Political Warfare Against Ordinary Americans


As I promised in my previous post, this post will examine the role employee non-compete agreements — and the even more nefarious and hidden phenomenon known as no-poach agreements — play in the political assignment of civic freedoms.

Imagine it is 2013 and you are looking for a part-time job in retail food service to supplement your income. When you were a teenager you had worked in a local grill and despite the relatively low wages you had appreciated the schedule flexibility and the companionship of your co-workers. You figure that you could handle the responsibilities of a part-time shift manager at a place like that and earn enough to meet your budget goals.

Over the course of the succeeding decades most of the local fast-food joints have been put out of business by competition from national chains that now dominate in the nearby commercial district. “How much different can it be to work in a chain restaurant?” you figure, and submit applications to a number of them.

Eventually, you score an interview at the local Jimmy John’s sandwich shop. The hiring manager is impressed by your maturity and offers you a job with the encouragement that after learning the shop’s procedures you could easily advance to a part-time junior manager. “That’s the ticket!” you say to yourself and decide to take the job.

Ah, but! During the initial sign-up process you are handed the employment contract and — being the type of person you are — rather than assuming that the contract has all the usual legalese about at-will employment and prohibitions of various types of illegal behavior, you start scanning through it until you hit the paragraph that says you agree not to work for any other establishment that makes and sells sandwiches within two miles of a Jimmy John’s shop while working for Jimmy John’s and for the next two years after the end of your employment. “WTF?” you say to yourself and tell the hiring manager, “I wasn’t expecting a restriction like this. Thanks for the offer but I’ll have to decline.”

You leave the shop with a sinking feeling after the hiring manager wishes you a cheery, “Good luck!” Over the course of the next few weeks you get similar offers from an Arby’s, a Subway, a Panera Bread, and a Jersey Mike’s. Guess what, they all had similar language in their employment contracts! You finally luck out at the local McDonald’s. This time the interviewer hasn’t painted such a rosy picture of an up-and-coming management position, but between relief that the employment contract doesn’t include a non-compete clause and knowledge that there are plenty of McDonald’s restaurants with potential management openings within driving distance of your house, you take the job.

As expected, you learn the operations of the business quickly and within a few months receive multiple, albeit small, pay raises. Anticipating that you will soon be a viable candidate for a management post, you make a habit of watching the online management training videos offered by McDonald’s and putting as many of their recommendations into practice as a team member can.

Your McDonald’s has a bulletin board in the break room with a section on local McDonald’s job postings. Unfortunately for you, none of the jobs posted on that board are management-level. You’ve known since you started that a lot more McDonald’s job openings in your community are posted on McDonald’s website than show up on the bulletin board, probably because there just isn’t room to post them all.

You search McDonald’s website for part-time shift manager openings nearby and sure enough, a couple of McDonald’s in other parts of town have openings, so you apply. After two weeks without hearing back about either position, you make a follow-up phone call to the number listed on the application website to find out where you stand. You notice that both openings listed the same phone number. A human resources rep answers the phone. Here is the brief conversation:

“Based on your prior experience you are not a suitable candidate for either position. Thank you very much for applying.”

“What could I do to improve my chances for the next opening?”

“Your best bet is to apply for the positions posted on your local job board. Each McDonald’s franchise has its own hiring criteria and the franchise owners prefer to promote from within their own staff.”

You knew the franchise owners were your real employer, not McDonald’s corporation. The McDonald’s application website makes that clear as day. Each franchise makes its own personnel decisions. You now realize that the two locations with openings belong to a different franchise from the one that runs your location. You also realize that all the job openings posted on the local job board at your location are at locations run by the same franchise that runs your location. You are still puzzled by the human resources rep’s advice. It’s not like each franchise is allowed to have its own management philosophy. McDonald’s insists that managers at all facilities using its brand train for the same attitudes, procedures, and skills. Also, a preference for promoting from within isn’t the same as a law, but it is pretty obvious from their answer that trying to get hired for a position in that franchise was likely to fail. Yes indeed, time for another “WTF?”

Clearly, your options for turning this side-hustle work into a promising addition to the family’s income have narrowed considerably. I’ll bet you feel, not free, but trapped. What is going on here?

You have become a victim of two different schemes the retail fast-food industry has employed to reduce costs by suppressing the wages of their employees. You were confronted directly with the employee non-compete clause in the employment agreements you were asked to sign. The use of these non-compete clauses spread throughout the industry over the last few decades and only a few companies, such as McDonald’s and Chick-fil-A, never used them.

But employee non-competes are just the tip of the iceberg. Franchisees and the owners of national fast-food chains have a common interest in keeping wage costs low. Enter the “no-poach” clause in franchise agreements. When someone bought a McDonald’s franchise, they agreed that they would not hire anyone employed by a McDonald’s run by a different franchisee. That would prevent a spiraling wage war between different franchisees operating in the same market. Nobody told you about this arrangement between McDonald’s and its franchisees. One reason is that they don’t have to. There are many other reasons, all colored green. Hence the person on the other end of that phone call giving you BS about why you weren’t suitable for the management positions. There was only one reason you weren’t suitable: you already worked for another McDonald’s franchise!

So, where did such arrangements come from? They’re not anything new. In the old economic arrangements that preceded Western capitalism, slaves, land-bound peasants, and Russian serfs were not free to take their labor elsewhere. Mercantilism and its successor, capitalism, allowed workers from backgrounds of modest means more mobility to improve their economic circumstances. In a later post I will provide much more detail on the improvements and dangers the rise of capitalism brought for most people.

In the medieval guild system the master craftsmen who trained apprentices would attempt to limit future competition from their trainees by requiring them to practice their craft in another local market after their training was completed. In England the courts refused to honor these agreements due to systemic, persistent shortages of skilled craftsmen. This was an early victory for labor freedom.

By the time the United States gained independence the use of non-competes here was pretty rare, given that the English legal tradition of refusing to enforce them came over with the colonists. In the ensuing decades the country was just beginning the transition from an economy largely based on agriculture to early-stage capitalism. The new circumstances employers and laborers found themselves in prompted US state governments to reconsider their stance on “restraints on trade,” that is various contracts and arrangements that limited what could be sold or bought or who could work for whom. Non-compete and no-poach agreements are examples of “restraints on trade.”

“Wait,” you say, “why didn’t the US government regulate this?” OK, let’s revisit our high school civics and/or American history classes. As prescribed in the Constitution, government in the USA operates under the federal system. The US Constitution defines the limited roles and powers of the national government and leaves large areas of government action to be handled by the individual states. Employment contract laws fall within the jurisdiction of the states. State laws have varied in how they handle non-compete clauses ever since the country was established. A few states, e.g. California, have had near-absolute bans on employee non-compete contracts. Many more have allowed for them but under limited circumstances, with some states tending to enforce them in more cases than other states. Much of this variation arose in the course of the 19th century as states adjusted their contract laws to deal with the increasingly complicated relationship between growing companies and their employees.

For a number of reasons industrialization picked up steam in the years after the American civil war. During this period a number of companies grew to such a size that they began to dominate their industries across state boundaries, and their owners and/or executives often employed tactics to block competing businesses based in other states from gaining a foothold or challenging their market dominance. The increasing economic and political power accumulated by these “trusts” began to have such notable ill effects on the health of the economy and the well-being of the majority of American citizens that an alarmed Congress passed the Sherman Anti-trust Act of 1890.

In a later post we will examine the way the federal government has enforced this Act in the last 130 years to prevent or break up monopolies. The Act also made it illegal for businesses to make agreements with each other that would prevent other competitors from damaging the economic power of the businesses included in the agreement. A “no-poach” agreement is a prime example of this type of illegal activity.

In the 20th century non-compete clauses became an increasingly common feature in employment contracts for senior executives, sales managers and their teams, and highly-trained engineers, analysts, and technical workers. This makes some sense. An employee who has intimate knowledge of a company’s proprietary trade secrets and operational procedures, some or all of which make it an effective competitor in the market, could damage the company’s prospects if they take that proprietary information to a competitor or start their own company in the same business. Likewise, someone with intimate knowledge of a company’s clients could wean them away after departing to work for a competitor. Legally enforceable non-compete agreements are a relatively easy way to discourage employees from this type of behavior.

Many experts argue that non-competes are unnecessary in almost all cases, because the main arguments in favor of them can be addressed by other means, such as non-disclosure agreements and trade secret and patent laws. But we will leave further discussion about the status of non-competes for the top 10% of American earners to the high-priced lawyers who specialize in such affairs. Instead, we will concentrate on the naked obscenity of using non-compete and no-poach agreements to steal freedom and wages from low-wage workers.

By the mid-2010s non-compete and no-poach agreements had become widespread in sectors of the economy that employed large numbers of low-wage workers, e.g. hospitality and retail sales. I was unable to find much research on how these types of agreements spread to different sectors of the economy over the last several decades. This is partly due to the difficulties in obtaining good data on these practices. The federal Bureau of Labor Statistics, for example, didn’t even start collecting data on the prevalence of non-compete agreements until 2017 and that decision was motivated in part by a 2014 New York Times report by Steven Greenhouse on the effects of the burgeoning use of non-compete agreements among low-wage workers in Massachusetts.

Another 2017 New York Times article on the harmful effects of no-poach agreements on the pay of low-wage workers moved intrepid state attorneys general in Washington, Massachusetts, New York and several other states to issue subpoenas to large companies operating in their states to obtain their franchising agreements. They discovered that many of these companies had no-poach provisions written into their contracts with franchisees. The attorneys general used this information to threaten, or actually initiate, legal action against the companies. For example, Bob Ferguson, the Washington state attorney general, and his team informed over 200 companies whose Washington state franchise contracts contained no-poach agreements that unless they nullified those provisions nationwide, the state of Washington would sue them. As of 2020 the state had secured agreements with 237 of these companies to nullify the no-poach provisions in their franchise contracts. This informative and at times hilarious interview with Bob Ferguson by the Pitchfork Economics team is worth a listen.

Why did the companies come to agreements with the states rather than fight them in court? Simple: no-poach agreements clearly violate the Sherman Antitrust Act. The companies’ legal teams knew they had no leg to stand on and told senior executives so.

If no-poach agreements are clearly illegal, why did companies use them in the first place? The answer is at least partly tied in with the answer to a related question: “Why do companies include non-compete clauses in their contracts with low-wage employees?” Put simply, companies do this because they can get away with it. In both cases companies recognize the vast power advantage they hold over low-wage employees and use these agreements to strip away even more of the workers’ bargaining power. Workers don’t know about the no-poach agreements, so they can’t complain about them, and they lack the resources to fight a legal battle if they violate a non-compete agreement. Either way, workers are cowed into staying with their current employer and accepting lower wages and significantly fewer opportunities to advance their careers.

The top executives and major shareholders of retail food service chains realized that if non-compete and no-poach agreements became the standard practice across the industry it would pretty much lock their front-line workers into their current positions or at most to promotions within the same franchise. Freezing worker mobility would not only reduce the employee turnover rate but it would also make it much easier for franchisees to turn down worker requests for more pay. These are the same people who worked hard to keep both the federal and state governments from raising the minimum wage. They benefited greatly from having society at large subsidize their workforce via programs such as SNAP, LIHEAP, and community food banks. They could retain their desperate, stressed workforce without increasing wages, let tax revenues and charity cover the difference, and have shareholders pocket the money that otherwise would have gone to workers.

This scheme works great when there is an excess of workers, but it creates real problems for franchisees when unemployment rates drop very low, as has been the case for a number of years now. When experienced retail workers can’t move between franchises or employers and there are not enough people on the sidelines available as new hires, positions go unfilled, forcing locations to reduce hours and/or services. Workers become more stressed and either leave or engage in theft or sabotage, and customers become irritated. For example, I had been a big fan of a local Wendy’s as a fast-food option because of the salads on their menu, but after they cut dine-in hours to a minimum and removed free wifi at all the Syracuse locations I visited, I stopped going altogether. I haven’t been back in four years and don’t plan on eating at a Wendy’s ever again.

One possible solution to this problem would be to increase the number of unemployed people. The newly-unemployed would become potential candidates for the open positions, and sales would drop, easing the pressure to hire additional workers to provide full services. In recent history, the US Federal Reserve has resorted to raising interest rates when the economy becomes “overheated,” expecting that the economy will slow down. This usually results in people losing their jobs, which would “solve” the worker problem for our franchisees. I’ll have much more to say about how government action can best deal with situations like this in a later post.

The other obvious solution would be to raise worker pay to retain and attract workers, but many franchisees were already squeezed by increased costs for supplies, insurance, rent, etc., the fixed fees they were required to pay to the corporation, and the risk to revenue posed by raising prices. Some franchisees raised both pay and prices. Some raised pay and took the hit to their own profits. Some decided not to raise pay and lived with the consequences. Regardless of the franchisees’ decisions, the corporations have not adjusted their fee structures to help them out.

Since economists began systematically researching the effects of these agreements on low-wage workers, evidence has accumulated that worker pay is about 4-5% lower than it would otherwise be in states that enforce these agreements. That evidence has led some state legislatures to pass laws restricting or outlawing the use of these agreements. The federal National Labor Relations Board issued a new rule in 2023 that treats corporations and their franchisees as joint employers, making both subject to unfair labor practice laws. By the way, this action reverses a 2020 decision under the Trump administration that essentially freed corporations from any liability for unfair labor practices committed by their franchisees. (That little piece of information is meant for those of you who believe that “both sides do it.”) Regulators at the Federal Trade Commission have also issued a rule banning non-competes nationwide for almost all employees. That rule has not yet gone into effect, pending the outcome of lawsuits filed by the US Chamber of Commerce and others. Whatever ends up happening to that specific rule, it is clear that more and more governments in our country regard the use of non-compete and no-poach agreements against low-wage workers as an unjust deprivation of workers’ freedom.

In subsequent posts we will explore several other methods corporations, their major shareholders, and their political and academic allies have used to enrich themselves at the expense of most of the rest of us. Some of these other methods have hurt working- and middle-class Americans far worse than the use of non-compete and no-poach agreements. We will find people defending these methods in the name of preserving and protecting freedom. I hope that this post will help you be more critical of these defenders when they trot out “freedom” language.