From Foraging to Modern Technology: The Evolution of Life and Work


Our relationships with “life” and “work” are complicated.

This is clear from the evolution of the term “work-life balance.” Countless variations have made headlines during the past four decades: work-life negotiation, work-life integration, work-life synergy, and more. Our concepts of life and work evolve, and our technology needs to evolve with them.

Never has this been more apparent. As our world reels from a global pandemic amid economic, social, and environmental uncertainty, business and HR leaders are exhausted. So is everyone else. A mountain of research suggests life-work dissatisfaction played a leading role in the Great Resignation, crippling many companies and industries as they’ve struggled to operate and provide the salaries the “market demands.”

And so, while we certainly need technology tools to attract and retain talent for this current crisis, the truth is that we need much more than that. We need solutions that are truly people-focused, that are life-aware, that support team members holistically both at work and in their personal lives. If history has taught us anything, it’s that technology in service of business often comes at the expense of people. The time has come to refocus technology in service of people, which positively impacts the business.

Most business leaders don’t consult macro-level historical trends when making key decisions. As humans, we’re also prone to a recency bias, believing that everything has always been as it is now. It’s easy to forget that there is a fascinating history to this thing we call work. So how did the modern workday come about? And what have we learned about work from these key moments in history?

Ancient History

For most of modern humankind’s existence, there were no “jobs.” People worked, of course; survival required hunting, foraging, outrunning predators, and constantly moving as part of a nomadic lifestyle. But these everyday responsibilities were basically “life.” Current anthropological studies show the average male spent 5.3 hours per day hunting and foraging, tasks that were largely leisurely and communal in nature. (Hunting is about 10 parts watching and walking to one part chasing and killing.) These hunter-gatherer cultures give us a pretty good glimpse into what this elusive “balance” looked like for our ancestors.

This changed when the Neolithic Revolution began. Over the course of hundreds if not thousands of years, humans learned to farm by domesticating plants and animals for food. This promise of a reliable, stationary food source allowed for permanent settlements, and populations grew quickly.

It is not an exaggeration to say the invention of agriculture changed everything, including introducing the concept of “work.” Work became more than a combination of daily tasks ensuring survival; people now had specialized jobs that distinguished them from their peers and provided skills or products that could be bartered or sold. People began working longer hours than they had in their hunter-gatherer days, and the first seeds of social class division and wealth disparity were sown. Thousands of years later, this stratification led to the “working class,” the “elite,” and enslaved labor. For those in the middle—the more affluent working class, like shop owners and artisans—it’s estimated that many worked a 6-hour day.

The Industrial Revolutions

Fast forward to 1536 A.D., when the French theologian John Calvin began preaching what is now known as the “Protestant work ethic.” As his teachings caught on, hard work was viewed not as a sign of lower social standing but as a way to commune with God, to prove morality and worthiness. At the time, the most significant economic activities in Europe (and eventually the New World) were small-scale farming and artisan handicrafts, which kept wealth disparities somewhat in check. Most people worked for themselves, and farming was by far the most common profession.

Things continued in this way until society was uprooted once again during the Industrial Revolution, which began in the mid-1700s. As technology and industrialization advanced, so did the amount of time the average person spent working each day.

The advent of industrial development promised to change the world as much as or more than agriculture had, revamping patterns of human settlement, labor, and family life. Large-scale mechanized agricultural equipment eventually replaced peasant farmhands and wiped out smaller farms, obliterating the primary profession of the poor, rural majority. Desperate for work, families flocked to industrial towns and cities, hoping for a better life working in the newly developed factories and mines. For the first time in U.S. history, most Americans became wage earners (people who worked for someone else).

This explosion of industrialization and urbanization in Britain, Western Europe, and the United States created a new class of wealthy entrepreneurs. Industrial capitalism also led to the need for more “white-collar” administrative and clerical workers, establishing a comfortable middle class in the process. The success of both was largely predicated on predatory employment practices for the “blue-collar” working class, which was mainly made up of immigrants and arrivals from farms and small towns.

This period is infamous for its incredibly dangerous, grueling working conditions, where employees regularly worked labor-intensive, repetitive shifts of 12 to 16 hours per day, six days per week. While manufacturing companies enjoyed great profits, keeping wages as low as possible was key to profitability, and industrial workers were barely paid enough to cover the cost of living. It’s estimated that, by 1912, industrial accidents killed 35,000 workers each year and injured two million others.

Unions and the Fair Labor Standards Act

During the Industrial Revolution, workers had no rights or leverage at all. There was no minimum wage, no limit to working hours, no workers’ compensation. High levels of immigration, combined with the depressions of the 1870s and 1890s, meant economic insecurity and unemployment were rampant. Labor unions formed, demanding more pay and fairer treatment, but their strikes and protests were often unsuccessful because these workers were simply replaced, and judges and police openly and sometimes violently partnered with employers to repress dissent.

There were, however, a growing number of largely middle-class and elite Americans (known as “Progressives”) who were vocally determined to cut down on the corruption and predatory practices of the time. Notably, some corporate leaders experimented with innovative “welfare capitalism” systems, providing employees with desirable benefits and offerings to improve satisfaction and discourage unionization. This was “Employer of Choice,” 1900 edition, and these efforts were typically quite successful.

But everything changed during World War I. The federal government quickly realized that the booming wartime economy required peaceful, productive, and happy employees, so it began facilitating the formation and growth of unions. By 1920, an estimated four million workers were active union members (a figure that would balloon considerably over the following decades). Women and Black people suddenly had new opportunities as white men were drafted into the military. Things were looking up.

And then there was the crash. The early years of the Great Depression saw a peak of 25% unemployment; wages fell by as much as 75%; one-third of farmers lost their land; and industrial production declined by over half. When President Franklin D. Roosevelt was elected, he attempted to pull Americans out of the Depression by providing greater economic opportunity and support to the masses. His New Deal improved the lives of millions of citizens while ensuring the right to unionization.

Perhaps most notably for this discussion, in 1938 Roosevelt signed the Fair Labor Standards Act (FLSA), which phased in a 40-cent-an-hour minimum wage and a 40-hour maximum workweek and set a minimum working age of 16. A labor standards board was also born, with the expectation that it could authorize higher wages and shorter hours when appropriate.

Modern History

It was World War II that finally put an end to the Depression, returning the economy to a state of “full employment” and solidifying the importance of a strong and healthy labor force. Women and minorities were once again temporarily welcomed into the industrial workforce, setting the stage for the Civil Rights and Feminist movements.

Americans emerged from World War II with higher wages, greater job security, and a better standard of living than any previous generation had enjoyed. Unions played a major role in this, and their successes created a “ripple effect” that ultimately raised wages and standards of living even for non-union workers. For the first time since the Agricultural Revolution, people were working fewer hours each week than their parents had.

Continued industrial and technological advancement led to more college-educated corporate employees, and by 1960 there were more white-collar than blue-collar employees in the U.S. workforce—an upward trend that’s continued for the past 60 years. In most cases, these “knowledge-based” jobs were (and are) unfairly viewed as superior; employers typically invested more in these workers, offering better benefits and salaries in addition to less physically demanding work.

The Role of Technology & the Rise of Work-Life Balance

This brings us to recent history—a time many of us have lived through. In the ’80s, President Ronald Reagan’s pro-business, pro-individualism message applauded the singular pursuit of career and financial success. Consumerism was fully taking hold. Corporations were changing their business models and shifting toward shareholder primacy, encouraging longer working hours to increase the value of the business at all costs. In many industries, long working hours—much longer than the 40-hour work week mandated for hourly positions 50 years earlier—were required, especially for highly paid or leadership positions.

Perhaps most important was the invention of personal technologies such as cell phones, computers, and email, which allowed employees to “stay connected” to work even when they weren’t at the office or work site. The Protestant work ethic still held strong, and many Americans earned the stereotype of stressed-out, exhausted workaholics.

It’s no surprise, then, that the term “work-life balance” was supposedly coined in the mid-1980s and exploded in usage during the 2000s, 2010s, and 2020s. Despite the fact that most Americans enjoy working conditions their grandparents and great-grandparents could only dream of, many people felt (and feel) burnt out, overworked, and out of balance. It’s worth remembering that even these “Golden Age” working conditions are far different from what our species, Homo sapiens, evolved to do.

Just as in previous eras, technological advancement promised to increase effectiveness while decreasing effort, but in many cases it had the opposite impact. And this has proven to be to the detriment of both people and organizations (although it doesn’t have to be this way).

The COVID-19 pandemic forced people to take a step back, for perhaps the first time in their lives, and re-evaluate their priorities. When the act of leaving our homes became synonymous with risking our family’s lives, we saw clearly what our priorities were—and for many Americans, “living to work” didn’t cut it.

Additionally, many industries shut down and rapidly laid off workers, heightening unemployment during the first six months of the pandemic. But as federal support and job dissatisfaction grew, and as more than eight million women left the workforce to care for their children and other family members, a historic war for talent began. Never before had we experienced sustained resignations across such significant portions of our workforce. The Great Resignation is also unique in that it continues to impact all areas of the economy—blue-, white-, and the emerging gray-collar workforce. Even heading into economic uncertainty, hiring and retention challenges remain at the top of most CEOs’ and CHROs’ concerns.

The Case for Life-work Technology

History and recent events prove that the case for culture and taking care of people has never been stronger. Leading businesses are looking for tools and technologies designed with people at their center, serving them and improving their lives rather than simply managing transactions and processes.

At UKG, we call this Life-work Technology, and we believe it is different from all other forms of HR, payroll, HCM, and workforce management systems. Life-work Technology combines advanced, data-driven work systems with authentically human people systems—tools designed to give leaders a deep understanding of what’s driving their team’s work patterns, behaviors, and aspirations. Rather than solely designing tools for workplace efficiency, it solves problems by allowing people to become their best selves and, in turn, contribute their best work. These tools understand that each employee is unique, with vastly different circumstances and needs, and support them in a variety of intelligent, automated ways, leveraging artificial intelligence and machine learning to automate the most appropriate choices for individuals and the business.

Life-work Technology is our chance to be on the right side of history—and to return to the balance our species was designed for.

Curious what this looks like for your organization? Here’s a case study of what Life-work Technology looks like in practice in three of the most challenging industries today: healthcare, manufacturing, and retail.