Remember when elections were fun? Each candidate put on their game face and brought their best to the table in an attempt to outwit one another on the campaign trail. Candidates promised voters the world — touting how they intended to help families, create jobs, improve education and strengthen national security, to name a few.
Those days are gone. The old dogs of the Democratic Party, in particular, have become visibly cynical. President Biden sternly gazed over onlookers at a recent speech, warning that a vote for a Republican might as well be a vote for an election-denying political extremist. Donald Trump may be out of office, yet the post-traumatic stress disorder rages on. Americans may have more pressing concerns — like how to afford their skyrocketing food, energy and housing costs — yet the MAGA-inspired fearmongering continues at MSNBC, CNN and in the pages of The New York Times and The Washington Post. The social media echo chamber does its part to amplify our dire “reality” — which sets the stage for still more self-fulfilling political prophecies of the same.
America is in a funk. For the political establishment, the culprit is not inflation, crime, yet another COVID-19 variant, diesel shortages that threaten to plunge the Northeast into a deadly winter — or even the prospect of “nuclear Armageddon” in Ukraine. The real problem? Democrats refuse to share power with Republicans.
While it is not unusual for Americans to be subjected to a hefty dose of negativity in an election year, what has changed in recent years is that social-media-saturated Americans endure election-year mudslinging 24 hours a day, 365 days a year. It is enough to make anyone cynical, with a majority of Americans convinced, according to a recent poll, that democracy is in trouble. What is more, when politicians and pundits take to social media year-round to peddle an endless stream of alarmism, it leaves very little room to raise the ante in the run-up to an election without straying into the weeds of the absurd and downright hysterical.
If nothing more, the 2022 midterm elections will answer the $64,000 question: Will voters take the bait?
Judging by the furrowed brows and weary looks on the faces of those who have carried the Democratic Party the longest, the jig may soon be up. Take, for example, former First Lady and Secretary of State Hillary Clinton, wide-eyed and sounding a familiar alarm: Republicans, she warned, “literally have a plan to steal the 2024 election.”
Even the formerly unflappable President Obama is not immune. The Barack Obama many of us remember from the mid-2000s carried himself with optimism, flashed a million-dollar smile and transfixed voters with his knack for oration. The Obama of 2022 hit the campaign trail on behalf of Democrats with doom and gloom on his mind. The positive attitude that carried the former president over the electoral finish line not once but twice — financial crisis notwithstanding — has been replaced with a wagging finger. Like his former vice president and his former secretary of state, Obama has campaigned for 2022 midterm candidates less to bring out the vote than to scare it up.
For Democrats, fear is apparently the only tool left in the toolbox.
If the transformation of “The Big Tent” to The Big Party Poopers has left you, too, shaking your head, you are not alone.
It is a presidential election year and by all counts the race is close. There is no question the post-recession recovery has been anemic at best. To call it a recovery is a stretch, and the threat of a double-dip recession lingers. Whether anyone can really turn this lackluster economy around is anyone’s guess. Talk of the unsustainable $16 trillion national debt looms large, but specifics on job creation remain few.
It’s not just abstract conversation for the nation’s unemployed and underemployed.
Culprits are a dime a dozen, among them offshoring and outsourcing — imbalanced “free trade” that in better economic times Americans were largely content to ignore in exchange for a bounty of Chinese-made bargains at Walmart.
Taxes are another favored target. What would happen if we eliminated corporate taxes? Not much, it turns out. Many major corporations already avoid paying federal income taxes, studies show. Beyond that, there remains an insurmountable wage gap between an American worker and a similarly qualified counterpart in the likes of Bangladesh. The argument can be made in favor of tax cuts, but they won’t necessarily translate into hiring more American workers because consumer demand remains low. Ditto for unions, which represent less than 7 percent of the U.S. workforce. Until Americans can subsist on less than minimum wage, driving down pay will have little impact on leveling the globalized playing field.
The usual suspects have been the subject of much discussion, but what about the not-so-usual suspects? The unemployed and underemployed may be asking, “Is it just me, or was it really easier to land a job before the Internet came into play?” Answering that question with any degree of certainty is difficult; it will take more research to settle it definitively. And as in any era, some things don’t change: Success depends on where one lives, who one knows, what type of industry is hiring and one’s education or skill level. One thing, however, has changed: In the pre-Internet days, looking for a job entailed pounding the pavement nearly as much as pounding a keyboard. Now it’s possible to shoot off a résumé from an easy chair. Still, there are indications that job hunting in the digital era isn’t as easy as the technology implies. Here we take a look at nine of the least-acknowledged reasons job seekers’ efforts — and the economy at large — continue to flounder in uncharted Information Age territory.
The World Wide Void
Social Networking: So you’ve got 300 friends and family connected on Facebook and five recommendations on LinkedIn. How many of those individuals yielded a job lead? Now how many of those leads actually panned out? Of those that panned out, how many of them were built upon a relationship or introduction that occurred principally online vs. off?
Due diligence takes on a whole new meaning in a sea of fly-by-night online universities, identity thieves posing as employers or recruiters, and social networking contacts who may do as much to harm one’s reputation as to advance it.
Cultural Shift: Hiring conventions have changed — and they’re generally less tolerant of those who deviate from the norm. HR staff who witness an applicant walk into an establishment to apply for or follow up on a job may characterize the effort as “annoying” and attention-getting gimmicks on the part of applicants as “creepy.” With few exceptions, today’s one-size-fits-all employment market eschews the personalized or creative, which speaks to the jaded wariness of our over-saturated, over-stimulated, life-on-an-electronic-leash culture.
Going above and beyond to stand out from your fellow job seekers may be an asset in an old-school employer’s eyes — but it could also earn you a reputation as an overly desperate “job stalker.”
Who’s Asking?: Thanks to the anonymity of the Internet, fewer employers feel it necessary to provide a name, address or phone number. On Craigslist and job boards alike, “company confidential” is an all-too-common practice. We’ve all heard the phrase “Don’t call us, we’ll call you” — but this takes opacity to a whole new level.
The irony of the Information Age is this: It has never been so easy to hide in plain sight!
The Internet: Never has a job search been more impersonal. Few jobs require job seekers to pick up an in-person application, scan the classified section of a local newspaper or interview on the spot at a job fair. In the past, looking for a job took footwork — physical effort to seek out job postings and to pick up and return applications. Now it’s so easy a caveman can do it. And that’s not good news for those who wish to stand out. Now that everything is online, everyone can apply — inundating employers with, for instance, out-of-area applicants who in the past would have had a hard time competing for a job advertised primarily in local newspapers. To compound the problem, Internet job postings more than a day or two old are rarely removed yet quite possibly defunct — oversaturated with applicants, and marginally qualified “junk” applicants at that. It’s a major reason why few Internet applicants ever learn whether their application is under consideration.
Even for those who score an interview, follow-up is increasingly rare. Who would’ve thought: The dreaded rejection letter of days gone by seems exceedingly quaint and polite in the vast Internet of today.
Impress the Machine: A look at a job-seekers’ handbook written in the pre-Internet age is revealing. Much of its advice surrounds the proper way to stand out and gain that coveted interview — passé, even comical advice by digital-era standards. Today’s job search has been largely reduced to an electronic form and an MS Word doc — subjecting applicants to the great equalizer that is the Internet itself. Because of the sheer volume of applicants the online process invites, the vast majority of applications will never be reviewed by the eyes of an HR manager. The initial screening process for employers is increasingly automated. A keyword locator scans an applicant’s submission. Applicant tracking software decides who makes the cut. This makes life particularly difficult for career-changers and fresh graduates — and anyone else whose résumé contains experience or knowledge a computer application fails to interpret or appreciate.
Thanks, too, to the multiplicity of online IQ and personality profile tests designed to weed out electronic applicants before human eyes make first contact, only those who fit the company “monoculture” make the cut. Even a college degree isn’t the ticket it once was, if only because so many people possess one. Moreover, the relevance of one’s degree is itself problematic. And it’s not just a U.S. problem: Chinese graduates are coming up over-educated and empty-handed, too. This is a particularly dangerous trend here in the States, where the skyrocketing cost of higher education necessitates steady work and a decent income with which to pay down student loans and move on to other goals in life, such as starting a family or buying a home.
It’s not personal. Employers are drafting ever more rigid requirements to narrow the field because there are too many applicants and too few jobs.
Typecasting: You’ve just graduated from college and can’t find work in your field, so you take a minimum wage job. Alternately, you’ve worked in your profession for a number of years but were laid off. You’ll take any job you can to get by — understandable, right? While it’s long been true that employers don’t like unexplained employment gaps, what is less appreciated is the price of taking a step backward. Most employers are looking for a progressive level of responsibility — a linear career trajectory. It may not be fair that doing what you had to do to get by in a recession counts as a strike, or that a weak job market made it necessary to accept less challenging or lower-paying work. And yet, for the reasons described above — the sheer volume of applicants an Internet connection permits — employers are working more aggressively to thin the herd. Applicants who work too long in an unrelated field, or for lesser pay and responsibility, have their work cut out for them — not unlike a character actor who has accepted one too many sci-fi or daytime soap roles.
Getting out of a career rut has never been easy. Technology only makes it easier to pigeonhole job seekers who fall outside a narrowly-targeted candidate profile.
The Jobs Paradox
Technology: Technology has opened the job market to more competition than ever before. Thanks to broadband, back-office and administrative work can be done virtually anywhere in the world. As automation enables greater efficiency, employers are apt to invest more money in such products and services — diminishing demand for human resources.
It doesn’t take as many people to answer phones — automated systems allow callers to await the next available representative. It doesn’t take as many people to take reservations, provide directory assistance or maintain a digital archive. Records, gaming and software delivery, too, are migrating to the “cloud.” It takes fewer employees to process photos — consumers process their own photos on home computers. It takes fewer secretaries — managers can answer their own calls and track their own schedules on a PDA. There are fewer copy editors working for publishing houses and media outlets, in part because the spell- and grammar-checkers in word processing applications have supplanted them. There are fewer printing presses, print news journalists, news and postal delivery people employed because demand is down. And, as President Obama said on the campaign trail last year, there are fewer bank tellers and cashiers because ATMs and self-checkout kiosks are turning banking and shopping into a help-yourself job.
The Internet is a self-serve medium, enabling online retailers to edge out local competitors. Just as the expansion of the Borders and Barnes & Noble chains threatened independent booksellers, competition from the likes of Amazon is encroaching upon brick-and-mortar bookstores — to which Borders has already succumbed. And it’s not just brick-and-mortar booksellers who are feeling the heat. Best Buy, which less than a decade ago edged out Circuit City among other local and national gadget retailers, has taken a whopping loss in profitability, too. What’s more, as local businesses vacate property and lay off workers, state and local tax revenues and commercial real estate values take a hit — a multiplier effect that only weakens the financial system further. And perhaps most unanticipated of all, dot-com bubble notwithstanding, the popularity of the virtual world has yet to translate reliably into real-world profits — even for popular social networking platforms. Perhaps no one knows the fallacy of the Internet-as-goldmine better than the print media industry, where circulation and ad revenues are down, bankruptcies are on the upswing and freelance writers, photographers and cartoonists are increasingly working at home — part time.
It has been an article of economic faith that the displacement of one job is equaled or exceeded by growth in another area. That assumption does not account for jobs bound for cheaper overseas labor markets, nor does it account for the reality that better-paying jobs frequently demand specific skill sets for which educational programs typically lag. Moreover, not every “Joe the Plumber” can expect to have — or to handle — the demands of increasingly high-tech, high-skill or creatively demanding jobs. And yet technology continues to hollow out the job market, shoving low-skill, service-level jobs to one end of the bell curve and highly educated Ph.D.s and creative and entrepreneurial geniuses to the other. The middle class isn’t eroding just because of global trade. It’s eroding because technology, when not eliminating jobs outright, requires a higher-than-average subset of workers to support it.
We can’t have a well-fed, well-educated and well-adjusted society, on the whole, if technology bifurcates the workforce along the lines of intellect and class.
Here Now, Gone Tomorrow: Staffing agencies have been with us for decades. What hasn’t been with us for decades is the idea that it is acceptable to farm out mission-critical business functions to outside firms and temporary contractors on a long-term or permanent basis. Nowhere, perhaps, is this more apparent than in the information technology field. Increasingly, field engineering is carried out by contractors, and entry-level developer work has shifted overseas or been undercut through H-1B visa foreign-worker insourcing. At the same time, data storage is moving off-site to cloud facilities — virtual, Internet-based distribution hubs serving a multitude of corporate clients. Software licenses and their associated administration costs are shifting, too, to cloud-based subscription services.
In-house information technology support staff are proportionately at risk of replacement by a contingent, just-in-time workforce brought in to further scale down costs. Therein lies the irony: Those who work to facilitate technological efficiency — consolidation through automation — are transforming their own livelihoods into ones characterized by employment gaps and benefit loss, a prospect of which students and career-changers with an interest in the IT field are growing wary. When one contract expires, the continuity of one’s personal income goes with it. As the tech talent pool shrinks, productivity may take on a whole new meaning as fewer and fewer technologists are employed or equipped to provide businesses with the continuity of service and security they demand.
Talented personnel cognizant of the eroding future in the IT and administrative fields are apt to apply their skills elsewhere, competing for jobs alongside the rest of us for an increasingly limited slice of the economic security pie.
Information Asymmetry: Today’s applicant selection process is more image-based than ever. Applicant tracking software allows companies and recruiting firms to form impressions not merely through a criminal or credit check but through whom you know and, for better or worse, the prevailing stereotypes those associations invoke. Digging into an applicant’s connections on LinkedIn and Facebook is common, and some employers have gone so far as to request passwords to social media sites as part of the screening process, while others collect Social Security numbers from all would-be contractors — not merely the new hires who make the final cut. Whether a candidate has engaged in inappropriate or questionable behavior is the obvious question, but it is far from the only consideration. One’s age, income and the overall character of one’s social network may work for or against applicants. Today’s employer has the upper hand like never before.
It’s verboten to include a photo of yourself with your résumé or thank-you card — but employers may view one online anyway. Worse, there is no reliable means of determining whether you’ve been discriminated against on the basis of age, race or gender as a result. The Internet has made EEOC regulation virtually unenforceable.
The social and economic questions this digital era provokes remain new and largely unanswered despite the ever-present push to move forward at all costs. And yet how accurately we define the challenges determines our success in adapting to change with the smallest amount of persistent, collateral damage. Do we want to continue headlong into a technology-made future wherein employers cast an unrealistically wide net for “local” applicants? Do policymakers wish to facilitate a future where there are fewer and fewer taxable Americans because technology has hollowed out the employment market, squeezing out the middle class? Do we feel it is justified to force applicants to “tell all” online, exposing themselves to identity theft and résumé rip-offs even as employers refuse to identify themselves? Is it time to revisit antitrust enforcement and break up monopolies across a variety of industries — to prioritize jobs, spur competition and combat market concentration?
These are the soul-searching questions our increasingly convenience-driven and complex society must ask itself. How we answer will determine whether high numbers of underemployed and unemployed workers continue to take their toll on American prosperity. So too do our answers foreshadow whether the American Dream remains within reach of those willing and able to pursue it.