Sunday, May 19, 2013

A worrisome look at the new workforce



In 2013 and beyond, landing a good job no longer depends on college grades or previous job titles or personal references or professional awards.
Employers increasingly focus on potential employees who demonstrate creativity, collaboration skills and analytical thinking. And that’s just to get in the door for an interview.
Those headed out the door, facing job losses due to outsourcing or automation or new technologies, could be any of us. In the emerging 21st century, no job is safe. Those are the new words to live by.

The days when you could graduate from college and do the same job, with the same skills, for four decades before easing into a comfortable retirement are disappearing.
Ongoing education and training will become standard procedure. And workers will be judged almost entirely on their skills and talent.
Classifying workers as blue collar or white collar is a 20th century anachronism. Authors Tom Friedman and Michael Mandelbaum instead define four job categories for the new century.

They are:
• Creative Creators — People who perform non-routine work in an exceptional and creative way, such as the best lawyers, doctors, writers, scientists, accountants and entertainers.
• Routine Creators — People who perform non-routine work in a routine way. This classification covers the same professions as the Creative Creators, but these average performers will increasingly be replaced by robots, microchips and computer software.
• Creative Servers — Low-skilled workers who do non-routine work in inspired ways, such as the baker who creates a special cake recipe or the nurse with extraordinary bedside skills.
• Routine Servers — Workers who do routine service work in a routine way. They are headed for extinction.
Most of us remember the days when the prospect of an ATM on every corner sounded like science fiction. In the coming decade, millions of workers will find themselves as outdated as the onetime service station attendant who pumped gas.

Beyond employers’ quickly shifting needs, American economic success in the 21st century will require us to return to a nation of inventors and “starter-uppers.”
Fortunately, it’s never been easier to start a company than in our high-tech, hyper-connected world. Unfortunately, our entrepreneurs are competing with better-educated scientists and engineers across the globe, particularly in Asia. Those competitors have everything they need in the palm of their hand: a smartphone, a tablet and cloud computing that lets a low-budget entrepreneur “rent” more computing power than the largest high-tech companies possessed a decade ago.
About 500,000 patent applications are filed in the U.S. annually, but only about half of them come from Americans.
We cannot allow that trend to continue.
New ventures, new ideas — that’s where future employment will be created.

After decades of corporate downsizing and outsourcing, Macomb County’s cities and townships can no longer hope that Ford or GM or Intel or Apple will come to town and build a 5,000-worker factory.
What the county needs is 100 people creating startups that employ 25 people each, 20 people launching companies that hire 50 workers each, and five people starting companies that employ 300 people each. Together, those ventures would add up to the same 5,000 jobs.

In their 2011 book, “That Used to Be Us” (on which much of this column is based), Friedman and Mandelbaum explain that many of the biggest issues facing our nation, including the creation of new jobs, have nothing to do with Republicans or Democrats or liberal or conservative doctrine. Understanding and harnessing the future job market requires nonpartisan, practical solutions.
For example, the sale of smartphone and tablet “apps,” an industry that did not exist prior to 2006, is expected to produce revenues of $38 billion by 2015.
The app industry looks for job applicants who combine software knowledge, art, math, creativity, writing, video gaming, education, composing and marketing. In other words, creating apps requires the combined skills of MIT, MTV and Madison Avenue.

The typical business school graduate need not apply. Law school might not be a good idea either.
A California company has developed a means of replacing lawyers with “e-discovery” software that relies on artificial intelligence. In the past, a company facing a complex major legal battle would hire a law firm that would bring in an army of lawyers and paralegals to review hundreds of stacks of documents at a cost that could exceed $2 million.
In contrast, Blackstone Discovery’s technology can thoroughly review and analyze 1.5 million documents for less than $100,000.
Beyond the huge cost savings, there’s another advantage: fewer mistakes. One Blackstone official told The New York Times: “People get bored. People get headaches. Computers don’t.”

In fact, the advancements in microchips and software technology are so extraordinary that you might need a Ph.D. to understand them.
According to a calculation by a Yale economist, between 1850 and 2006 the inflation-adjusted cost of performing a standard computational task did not fall by 100-fold or 1,000-fold. It dropped by at least 1.7 trillion-fold. And most of those gains came in the last 30 years.
As a result, it’s no wonder that many of us, particularly our kids and grandkids, will soon be performing jobs that have not even been invented yet.

The subtitle to the Friedman/Mandelbaum book is telling: “How America Fell Behind in the World It Invented and How We Can Come Back.”
The IT revolution was launched in the United States with the creation of the transistor and communications satellites, followed by the personal computer and the cellphone. Then things really kicked into high gear with the BlackBerry, PalmPilot, iPod, iPad, iPhone, Kindle, Skype, apps, and cloud computing. Twitter, Facebook and 4G networks are as common as TVs and typewriters were in the last century.
Yet, these American-made advances gave entrepreneurs and CEOs across the globe the “tool kit” to compete with us and remove the barriers they previously faced due to politics and geography.

Because of globalization and digitization, education is the key to the emerging economy, to our future prosperity, to our standing in the world.
Education and innovation have always been at the center of America’s formula for success. But in the 21st century, everything is moving much faster.
Public education at the elementary level was created to assist the nation’s agrarian society. At the dawn of the 20th century, only about 6 percent of teenagers graduated from high school. And only about 2 percent of those ages 18 to 24 were enrolled in a 2- or 4-year college. By the end of the century, 63 percent were headed straight from high school graduation to post-secondary education.

The next step is a nation in which the K-12 mentality is finally replaced by a K-14 approach to schooling — 12 years leading up to a high school diploma, followed by two years at a community college or vocational school. That should be the new minimum.
Without improvements to our mediocre education system, America will lose its competitive edge and repeat mistakes such as those made in the solar power industry. The U.S. dominated that market just a few years ago, but today roughly half of the world’s solar panels are made in China, with Germany at No. 2. When Applied Materials, a Silicon Valley-based company, set its sights on establishing the world’s largest commercial solar research and development center, it chose a site in China. It initially hired 330 people, one-third of whom had master’s or Ph.D. degrees.

In the 20th century — the American Century — such a facility would have been built without a second thought in the U.S. and would rely upon top-notch American scientists and engineers.
That used to be us. But we can be that again.
