How Immigration Affects the U.S. Economy – 11 Myths to Dispel


 
Immigration has long been a controversial subject for Americans, despite the country’s reputation as the world’s melting pot. In times of economic uncertainty, emotions run especially high, and partisans on both sides of the political divide exploit the controversy for their own gain.
 
Knowing what’s fact and what’s fiction is particularly tricky in the unregulated, anonymous world of social media. To separate truth from fear, it’s important to know the facts behind the issues. Here’s how immigration affects several aspects of the U.S. economy.

Immigration Myths

According to the Migration Policy Institute (MPI), there are approximately 45 million immigrants in the United States today, making up about 13.5% of the population. Adding their U.S.-born children nearly doubles those figures, to roughly 87 million people and 27% of the population. Over 80% of immigrants have lived in the country for more than five years, and almost one in three owns a home.
 
Yet while immigrants are a part of our neighborhoods, schools, and workplaces, misconceptions about them abound. Here are some of the most common.

Myth #1: Most Immigrants Come From Latin America

Many Americans believe that immigrants come predominantly from Latin America, sneaking across the border. While Latin Americans accounted for 37.2% of immigrants in 2016, the composition of the immigrant population has changed significantly over the past half-century. In 1960, the largest immigrant groups were from Italy, Germany, the U.K., and Canada, according to the MPI. European countries accounted for almost half (48.5%) of the total, and the Soviet Union (7.1%) had a higher share than Mexico (5.9%).
 
In 2016, the largest immigrant groups came from Mexico (26.5%), India (5.6%), and China (4.9%). Mexico, Central America, and the Caribbean (including Cuba) accounted for the largest proportion of legal and illegal immigrants, but not a majority. Asia represented slightly more than 20%, with the rest of the world making up the remaining 42.5%.

Myth #2: Most Immigrants Are Illegal

Some Americans believe most foreigners are in the United States illegally. That is not true. Illegal immigrants account for about 24.5% of the immigrant population but a meager 3.4% of the U.S. population in total, according to Pew Research.

Myth #3: Immigrants Are Unskilled & Uneducated

Some Americans assume immigrants are uneducated, unskilled, low-wage workers. However, the MPI found that half of immigrants have at least a high school diploma. Two-thirds of immigrants over the age of 16 are employed, with almost a third (31.6%) working in management, business, science, and the arts, compared to 38.8% of native-born citizens.
 
It’s true that a higher proportion of immigrants (24.1%) work in low-wage service jobs than native-born citizens (16.8%). However, the libertarian-leaning Cato Institute, citing statistics from the U.S. Department of Homeland Security and others, states that immigrants are “generally much better educated than U.S.-born Americans are … [and] 62 percent more likely than U.S.-born natives to have graduated college.”
 
Foreigners who work in the United States on H-1B visas have bachelor’s degrees or higher and work in specialized fields such as IT, engineering, mathematics, and science. President Trump and others have complained that H-1B visa holders compete with Americans for high-paying jobs. However, the visa program was created to allow companies to employ foreign workers for three years or more in specialty occupations for which there are not enough skilled Americans to fill the positions.
 
Read more . . .

Is Social Security Going Broke? Possible Solutions

More than half of Millennials believe there will be no money in the Social Security system by the time they are ready to retire, according to a 2014 Pew Research report. “I don’t think anyone honestly expects to collect a single penny they pay into Social Security. I think everyone acknowledges that it’s going to go bankrupt or kaput,” says Douglas Coupland, author of “Generation X.”
 
What went wrong? Will Social Security go bankrupt?

A Brief History of Social Security

In 1935, few of the program’s creators could have anticipated the condition of the Social Security program today. The country was in the midst of the Great Depression with a quarter of its labor force – 15 million workers – idle, and those with jobs struggled to make ends meet as their hourly wages dropped more than 50% from 1929 to 1935. Families lost their homes, unable to pay the mortgage or rent. Older workers bore the brunt of the job losses, and few had the means to be self-supporting. One despairing Chicago resident in 1934 claimed, “A man over 40 might as well go out and shoot himself.”
 
Hundreds of banks failed in the span of a half-decade, erasing the savings of many Americans. People lived in shanty towns (“Hoovervilles”) or slept outside under “Hoover blankets” (discarded newspapers). Breadlines emerged in cities and towns to feed the hungry. Thousands of young American men hopped passing trains, sneaking into open boxcars in a desperate attempt to find work.
 
Democrat Franklin D. Roosevelt (FDR), promising a New Deal, defeated incumbent President Herbert Hoover in 1932 with more than 57% of the popular vote and 472 of 531 Electoral College votes. Three years later, FDR signed a bill that would “give some measure of protection to the average citizen and to his family against the loss of a job and against poverty-ridden old age.”

7 Ways to Prevent Political Arguments With Family and Friends


 
“There cannot a greater judgment befall a country than a dreadful spirit of division as rends a government into two distinct people, and makes them greater strangers, and more averse to one another, than if they were actually two different nations.”
 
So wrote English essayist and playwright Joseph Addison in 1711 of the hyper-partisanship that led to the English civil wars of the 17th century. Almost 100 years later, George Washington warned of the dangers of political parties in his 1796 Farewell Address. Despite these cautions, America still struggles with partisan politics, today more than ever.
 
Political party affiliation has become the measure we most often use to distinguish friend or foe — more defining even than race, religion, or relationship. Politics draw lines between us, creating tribes surrounded by moats of mistrust. As a result, family gatherings have become battlegrounds with each side determined to take no prisoners.
 
The first step to calm political strife between family and friends is to understand what causes extreme partisanship. Here’s a closer look at why people hold on to their beliefs so fiercely, followed by seven ways you can defuse tensions when the topic of politics crops up at your social gatherings.

The Origins of Hyper-Partisanship

A “partisan” is a member of a group that shares similar interests and goals. Political parties and partisanship have existed since the ancient Greeks, arising whenever people disagree with a government’s actions (or inaction). Driven by competing visions of the future, partisanship is a natural outcome of democratic government.
 
Political parties in the United States began as broad umbrellas under which members held similar, though not identical, interests and views on a majority of issues. Tolerating these differences was necessary at first to build political strength and win elections, and in the two decades following WWII, both parties contained conservative and liberal wings. Intra-party battles over platforms were intense, concluding in compromise positions that few liked but the majority could accept. As a result, the final platforms of the two parties often resembled each other and left voters feeling that there wasn’t “a dime’s worth of difference between the two,” as candidate George C. Wallace, who represented the American Independent Party, famously said in the 1968 presidential race.
 
The splits within the parties also diminished the power of party leaders to force maverick officeholders to hew to the party line. Legislation, the result of cobbling together ad hoc coalitions of officeholders, was rarely extreme and reflected the trade-offs necessary for passage.
 
Read more . . .

Understanding Blockchain Technology – How It Will Change the Future


In October 2008, an author writing under the pseudonym Satoshi Nakamoto released the white paper “Bitcoin: A Peer-to-Peer Electronic Cash System,” which outlined “a system for electronic transactions without relying on trust.” That system, known as blockchain, became the basis for the world’s first widely accepted cryptocurrency, bitcoin. It is also a foundational technology with the potential to affect society as dramatically as the invention of the Internet itself.
 
Don Tapscott, author of “Blockchain Revolution: How the Technology Behind Bitcoin is Changing Money, Business, and the World,” claimed in an interview with McKinsey & Company that blockchain is “an immutable, unhackable distributed database… a platform for truth… a platform for trust.” An unapologetic, enthusiastic supporter of blockchain, he adds, “I’ve never seen a technology that I thought had a greater potential for humanity.”
 
Is the hype around blockchain justified? Let’s take a look.

The Dangers of Digital Transactions

Mutual trust is the basis for business transactions. Yet as society has grown more complex, our ability to trust another party — especially if they’re unknown and halfway around the world — has decreased. As a result, organizations develop elaborate systems of policies, procedures, and processes to overcome the natural distrust arising from the uncertainties of distance, anonymity, human error, and intentional fraud.
 
At the heart of this distrust is the possibility of a “double spend,” or one party using the same asset twice, particularly when the assets being exchanged are digital. When exchanging physical assets, the transaction can only occur at one time in one place (unless forgery is involved). In contrast, a digital transaction is not a physical transfer but the copying of data from one party to another. If there are two digital copies of something for which there should be only one, problems arise. For example, only one deed to a house should be valid at any given time; if there are two seemingly identical copies, two or more parties could claim ownership of the same asset.
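To make the problem concrete, here is a minimal sketch in Python (the account names and “coin” object are hypothetical, purely for illustration): when value exists only as copyable data, nothing stops its holder from sending the same copy to two different recipients, while a shared ledger that tracks balances can refuse the second spend.

```python
# Illustrative sketch of the double-spend problem (not a real payment system).

class NaiveDigitalAsset:
    """Value represented as plain, copyable data."""
    def __init__(self, serial: str):
        self.serial = serial


def naive_transfer(asset: NaiveDigitalAsset, recipient_wallet: list) -> None:
    # "Sending" digital data really means copying it: the sender still holds
    # an identical copy and can send it again to someone else.
    recipient_wallet.append(NaiveDigitalAsset(asset.serial))


class BalanceLedger:
    """A single shared record of balances that refuses a second spend."""
    def __init__(self, opening_balances: dict):
        self.balances = dict(opening_balances)

    def transfer(self, sender: str, receiver: str, amount: int) -> bool:
        if self.balances.get(sender, 0) < amount:
            return False  # the seller no longer has the funds -- spend rejected
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
        return True


# The same "coin" sent to two people by copying the data:
coin = NaiveDigitalAsset("coin-001")
alice_wallet, bob_wallet = [], []
naive_transfer(coin, alice_wallet)
naive_transfer(coin, bob_wallet)  # both wallets now hold an identical "coin-001"

# With a shared ledger of balances, the second spend fails:
ledger = BalanceLedger({"seller": 1})
print(ledger.transfer("seller", "alice", 1))  # True
print(ledger.transfer("seller", "bob", 1))    # False -- already spent
```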
 
Unfortunately, the systems and intermediaries required to ensure, document, and record business transactions have not kept pace with the technological changes of a digital world, according to Harvard Business Review.
 
Consider a typical stock transaction. While the trade — one party agreeing to buy and another party agreeing to sell — can be executed in microseconds, often without human input, the actual transfer of ownership (the settlement process) can take up to a week to complete. Since a buyer can’t easily or quickly verify that a seller has the securities the buyer has purchased, nor can a seller be confident that a buyer has the funds to pay for that purchase, third-party intermediaries are required as guarantors to ensure that each party to a trade performs as contracted. Unfortunately, these intermediaries often add another layer of complexity, increase costs, and extend the time it takes to complete the transaction.
 
Our existing systems are also vulnerable to intentional attempts to steal data and the assets they represent. International Data Corporation reports that businesses spent more than $73 billion on cybersecurity in 2016, a figure projected to exceed $100 billion by 2020. These numbers don’t include security expenses for non-businesses or governments, the cost of wasted time and duplicated efforts due to data breaches, or the expense of any remedies to those affected.
 
Blockchain technology presents a remedy for these issues that could significantly alter the way we do business in the future.

How Blockchain Technology Works

Understanding blockchain starts with understanding “ledgers” and how they’re used. A ledger is a database that contains a list of all completed and cleared transactions involving a particular cryptocurrency, as well as the current balance of each account that holds that cryptocurrency. Unlike accounting systems that initially record transactions in a journal and then post them to individual accounts in the ledger, blockchain requires validation of each transaction before it is entered into the ledger. This validation ensures that each transaction meets the defined protocols.
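As a rough illustration, here is a minimal Python sketch of a validated, chained ledger (a simplification under assumed rules: real blockchains add digital signatures, peer-to-peer consensus, and mining, none of which is modeled here). Each transaction is checked against current balances before it is appended, and each block stores a hash of the previous block, so altering past entries breaks the chain.

```python
import hashlib
import json


def block_hash(block: dict) -> str:
    """Deterministic hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()


class SimpleLedger:
    """Append-only chain of validated transactions (illustrative only)."""

    def __init__(self, opening_balances: dict):
        self.balances = dict(opening_balances)
        # A genesis block anchors the chain.
        self.chain = [{"index": 0, "tx": None, "prev_hash": "0" * 64}]

    def add_transaction(self, sender: str, receiver: str, amount: int) -> bool:
        # Validation happens BEFORE the entry reaches the ledger.
        if amount <= 0 or self.balances.get(sender, 0) < amount:
            return False
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount
        block = {
            "index": len(self.chain),
            "tx": {"from": sender, "to": receiver, "amount": amount},
            # Each block records the hash of the one before it,
            # linking the entries into a chain.
            "prev_hash": block_hash(self.chain[-1]),
        }
        self.chain.append(block)
        return True

    def is_intact(self) -> bool:
        """Recompute the hash links; tampering with past blocks breaks them."""
        return all(
            self.chain[i]["prev_hash"] == block_hash(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )


ledger = SimpleLedger({"alice": 10})
print(ledger.add_transaction("alice", "bob", 4))   # True  -- validated, then recorded
print(ledger.add_transaction("alice", "bob", 40))  # False -- fails validation
print(ledger.is_intact())                          # True
```

In an actual blockchain network, many independent computers hold copies of this chain and must agree on each new block, which is what makes the shared ledger so difficult to tamper with.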
 
Read more. . .