- Future Of Petroleum Framed By Politics And Economics
- Environmentalists Hit BOEMRE With Suit Over Gulf Permits
- Green Energies Clash And Suffer Under Renewable Mandate
- Is High Unemployment Destined As The New Normal?
- People Starting To Connect Dots Of Renewable Energy Cost
- U.S. Hurricane Forecast Explains How 2011 Could Be Worse
Musings From the Oil Patch
June 21, 2011
Note: Musings from the Oil Patch reflects an eclectic collection of stories and analyses dealing with issues and developments within the energy industry that I feel have potentially significant implications for executives operating oilfield service companies. The newsletter currently anticipates a semi-monthly publishing schedule, but periodically the event and news flow may dictate a more frequent schedule. As always, I welcome your comments and observations. Allen Brooks
Future Of Petroleum Framed By Politics And Economics (Top)
Nearly two months ago when the Middle East was slowly coming off the boil politically, The New York Times’ Pulitzer Prize-winning columnist Thomas Friedman wrote an opinion article entitled “End of Mideast Wholesale” in which he described how the Arab spring had changed the dynamics of peace in that region. He likened the changes sweeping the region to the cost increase that comes when one can no longer buy goods wholesale but rather has to pay retail prices. As Mr. Friedman wrote: “If you look into the different ‘shop’ windows across the Middle East, it is increasingly apparent that the Arab uprisings are bringing to a close the era of ‘Middle East Wholesale’ and ushering in the era of ‘Middle East Retail.’ Everyone is going to have to pay more for their stability.”
His primary example was Israel, which has enjoyed 30 years of peace with Egypt by having had to make peace with only one man – Hosni Mubarak, Egypt’s president. With the ousting of Mr. Mubarak, Israel now will have to make peace with the nation’s 85 million Egyptians. Thus, from wholesale (Mr. Mubarak) to retail (85 million Egyptians), the economics of peace in the Middle East are changing.
It seems to us that this shift is also underway in the energy sector and for commodities in general, although the change may not be quite as obvious as Middle East peace. What is driving the change in the energy business is the shift from high-quality rock to lower-quality rock. It is most pronounced in the explosion of gas shale drilling around the world and the move by the oil industry to exploit more of the heavy oil deposits globally. The shift has been helped by improvements in horizontal drilling and hydraulic fracturing technologies. But at the end of the day, the energy business is shifting from lower-cost, more prolific oil and gas resources to higher-cost, less prolific deposits. This statement would seem to fly in the face of the oil and gas industry’s trumpeting of the high initial flow rates from gas and oil shale wells, but those successes are largely due to exposing more of the formation to a wellbore and then applying many more fracturing treatments. So while these wells flow at much higher initial rates, they also experience very rapid declines and the wells are extremely expensive to drill and complete. We are treated to a plethora of claims of very low economic costs for these shale wells, but most of the estimates are suspect because almost every announcement involves some creative accounting obscuring their true cost.
The economics of heavy oil are easier to see. Over time, even with technological improvements, the cost of developing these heavy oil resources has escalated as the rock quality has declined. The saving grace supporting the exploitation of the resource has been the jump in global crude oil prices, helped partly by the uncertainty of supply due to the Middle East unrest and the accelerating demand from developing economies.
One cannot ignore the impact of political instability in the Middle East and how the Western world is adjusting to that change. High unemployment, especially among the younger population, economic hardship, ongoing recession, lack of freedoms and widespread abuse by security and intelligence organizations have existed in many of these Middle East countries for a long time, but with the aid of social media, the people have been able to rally support to highlight these problems and to demand substantive change. The Western world has watched this change in amazement and has encouraged the shift in social attitudes. While the outcomes will produce changes in governmental structures and leadership, they will also translate into broader control over a country’s natural resources, in most countries the source of its wealth.
A particularly troubling aspect of this change in the social fabric of Middle Eastern countries is that the stability of Saudi Arabia can no longer be taken for granted. As we wrote in our last Musings, the Accelerated Transformation Program at Saudi Aramco dovetails with the Kingdom’s latest efforts to form alliances with other Middle East Muslim countries to counter the growing militancy of Iran. Some of the tension in the relationship between Iran and Saudi Arabia played out in the recent OPEC meeting, which for the first time ended without an agreement on production quotas for the organization. Saudi Arabian Oil Minister Ali Al-Naimi met with the media to express his outrage over how the meeting ended and to announce that his country would ensure that the world had an adequate supply of oil to prevent crude prices from soaring. However, as the country announced it would boost daily production from 8.5 million barrels per day to 10 million, Saudi Aramco quietly reduced its price discount, thus lifting its profitability while at the same time appearing to be a hero by pumping more oil into the marketplace.
To understand the impact of the changes underway in the oil market, we turned to a strategy article written in April by the long-term value-oriented money manager, Jeremy Grantham of GMO. The article was titled, “Time to Wake Up: Days of Abundant Resources and Falling Prices Are Over Forever.” The article began with a summary section that highlighted a number of key points. Quoting several of these points demonstrates how Mr. Grantham views the future course of world economies and commodities.
“Until about 1800, our species had no safety margin and lived, like other animals, up to the limit of the food supply, ebbing and flowing in population.
“From about 1800 on the use of hydrocarbons allowed for an explosion in energy use, in food supply, and, through the creation of surpluses, a dramatic increase in wealth and scientific progress.
“The rise in population, the ten-fold increase in wealth in developed countries, and the current explosive growth in developing countries have eaten rapidly into our finite resources of hydrocarbons and metals, fertilizer, available land, and water.
“The problems of compounding growth in the face of finite resources are not easily understood by optimistic, short-term-oriented, and relatively innumerate humans (especially the political variety).
“Statistically, most commodities are now so far away from their former downward trend that it makes it very probable that the old trend has changed – that there is in fact a Paradigm Shift – perhaps the most important economic event since the Industrial Revolution.”
To demonstrate his point, Mr. Grantham used a few charts. The first was the growth of the world’s population from 1500 to 2000 along with a series of projections by the United Nations. The chart and its projections show an exploding global population that will put serious strains on our finite and renewable resources. Recent articles from leading demographers, however, are making the case that the world’s population growth rate is slowing and will not hit the 11 billion people the UN is projecting.
Exhibit 1. World Population Growth Exploding

To understand the trend in commodity prices, GMO created an equal-weighted index of the 33 most important commodities. As shown in Exhibit 2 below, from 1900 to 2002 the average annual price decline of this index was 1.2%. This decline in the cost of commodities certainly contributed to the increased wealth and well-being of citizens in our economy. During this time, we were continually digging lower grade ores and drilling for oil and gas in more challenging formations, but technology enabled us to lower prices by over 70% in real terms. It is important to understand that this record may have been a historical accident. Whatever the increase in the marginal cost was, it was offset by a greater increase in productivity, which may have been the favorable result of our particular balance between supply and demand. Imagine what would have happened if the marginal cost had risen faster and productivity improved more slowly. Also, demand could have risen faster, and this is the challenge we are dealing with now.
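For readers who want to check the compounding behind the “over 70%” figure, the arithmetic is straightforward: a 1.2% average annual real decline sustained over the 102 years from 1900 to 2002 leaves the index at under 30% of its starting level. A quick sketch:

```python
# Verify the compounding claim: a 1.2% average annual real price
# decline from 1900 to 2002 cuts the index by roughly 70%.
annual_decline = 0.012
years = 2002 - 1900   # 102 years

remaining = (1 - annual_decline) ** years
print(f"Index level vs. 1900: {remaining:.1%}")        # about 29.2%
print(f"Cumulative real decline: {1 - remaining:.1%}") # about 70.8%
```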
Exhibit 2. Commodity Price Decline Has Reversed
Prior to 1995, there was essentially no difference in aggregate growth between the developing world and the developed world. Since then the balance in growth rates has shifted to roughly 3-to-1 in favor of developing economies. Mr. Grantham showed that for the 102 years up to 2002, the price of nearly every commodity had reached all-time lows except for oil, which had changed direction in the early 1970s. One of the critical factors that contributed to this change in direction for commodity prices was the emergence of China and its high capital spending, which was over 50% of its gross domestic product. That was a capital investment level never before reached by any economy in history, and it exceeded prior records by a wide margin.
Exhibit 3. Cheap Oil Supplies Replaced By Expensive Ones
In focusing on crude oil production, Mr. Grantham recites the story of Shell geologist M. King Hubbert, who predicted that U.S. output would peak around 1970. In fact, U.S. oil production did peak in 1970 and began a slow but steady decline until the past couple of years, when new production from the Bakken formation in North Dakota boosted overall output. On a global basis, conventional oil production peaked in 1982 and only the more expensive offshore and unconventional crude oils have enabled the world to boost its overall output. The history of new conventional oil discoveries and production growth demonstrates the challenge the global oil industry has been facing for a long time. Mr. Grantham then shifts to a discussion of how crude oil prices peeled off in the early 1970s from the declining trend in prices exhibited by the other commodities. He believes this historical pattern is what we are now experiencing with all other commodities. Unfortunately, he gets some of his facts wrong, but the essence of the price change phenomenon is correct.
Exhibit 4. Conventional Oil Discoveries In Decline

According to Mr. Grantham’s version, OPEC was formed in 1974 and it was able to boost prices because of rising demand and a finite resource. The date of the formation of OPEC is wrong – it was founded in 1960. What actually happened was that U.S. oil production peaked and, by the early 1970s, pricing power shifted from the Seven Sisters to OPEC. This shift in who would supply the world’s incremental barrel of oil enabled some of OPEC’s member countries to use their output as a political weapon. That pricing power, given extremely strong demand growth, lifted oil prices from about $2 per barrel to $12 in a matter of a few years. At the end of the 1970s, when the Iranian revolution occurred and its production was removed from the market, the world was physically short of supply and oil prices soared from the $14 a barrel level to $32. That price jump inflicted huge economic damage via inflation and sharply higher interest rates, resulting in a major global economic recession.
Exhibit 5. Oil Prices Reflected Paradigm Shift Early
For Mr. Grantham, the movement in global oil prices reflects what is going on in the commodities market as demand is forcing the world to deal with rising marginal costs from having to seek lower quality supply sources. In the preceding chart (Exhibit 5) of global oil prices in 2010 dollars from 1875 to early 2011, Mr. Grantham showed how oil prices moved within one and two standard deviations of the average price for 1875-1972. That was a period of remarkable price stability with low volatility. He then calculated the same mathematical relationships for oil prices for 1973-2005 and for 2006-2011. Based on the latest data, oil prices in their extreme volatility (two standard deviations) will range between about $37 on the low side and nearly $180 a barrel on the high side. Within one standard deviation, the price range is $50 to $120, remarkably similar to the trend in oil prices for the past two and a half years.
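The band calculation itself is simple to reproduce. We do not know GMO’s exact method, but one plausible approach, given how skewed commodity prices are, is to take the mean and standard deviation of log prices and convert back to dollar bands. The prices below are hypothetical placeholders for illustration, not Grantham’s actual series:

```python
import math

# Hypothetical annual average oil prices (2010 $/bbl) -- illustrative
# numbers only, not the data behind Exhibit 5.
prices = [61, 72, 99, 62, 79, 95, 104, 98, 93, 49, 43]

# Work in log space: prices are bounded below by zero and right-skewed,
# so standard-deviation bands are more sensible on log prices.
logs = [math.log(p) for p in prices]
mean = sum(logs) / len(logs)
var = sum((x - mean) ** 2 for x in logs) / (len(logs) - 1)  # sample variance
sd = math.sqrt(var)

for k in (1, 2):  # one- and two-standard-deviation bands
    low = math.exp(mean - k * sd)
    high = math.exp(mean + k * sd)
    print(f"{k} sigma band: ${low:,.0f} to ${high:,.0f} per barrel")
```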
What the oil price chart demonstrates is that when fundamentals change and the industry is forced to seek incremental output from lower quality rock, the cost level is dramatically altered. We may see many geopolitical and economic factors at work in the short-term that make us think that either high or low prices are merely transitory, but over the long-term fundamentals rule. Unless we stop using crude oil (not likely), what goes on in the Middle East or in China’s economy or in the developed world matters little. We are destined to be confronted by ever higher oil prices. These prices may be more volatile in the future due to the geopolitical and economic factors cited above, but higher commodity prices will be our constant and uncomfortable companion.
Environmentalists Hit BOEMRE With Suit Over Gulf Permits (Top)
A little over a week ago, two lawsuits were filed against the U.S. Department of the Interior’s Bureau of Ocean Energy Management, Regulation and Enforcement (BOEMRE) over its approval of a plan of exploration filed by Shell Oil, the U.S. subsidiary of Royal Dutch Shell (RDS.A-NYSE), for a development in the deepwater Gulf of Mexico. The two lawsuits were filed with the 11th Circuit Court of Appeals in Atlanta, Georgia. One lawsuit was filed by the Defenders of Wildlife, the Center for Biological Diversity and the Natural Resources Defense Council while the other was launched by Earthjustice, the Gulf Restoration Network, the Florida Wildlife Federation and the Sierra Club. Both suits, in varying degrees, allege that the approval of the drilling plan by BOEMRE was done in violation of the National Environmental Policy Act and several other federal laws including the Endangered Species Act, the Marine Mammal Protection Act, the Magnuson-Stevens Fishery Conservation and Management Act, the Oil Pollution Act and the Outer Continental Shelf Lands Act. The complaints are based on the position that the agency failed to perform environmental studies and consult with various other agencies and bodies to ensure that the granting of the permits would not harm marine life or the environment of the Gulf.
On March 31, 2011, BOEMRE deemed submitted a supplemental exploration plan by Shell for the drilling of five new wells and the sidetracking of another well along with the four existing wells, in the company’s Appomattox development located 72 miles off the Louisiana coast. The development lies in parts of Mississippi Canyon Blocks 348, 391 and 392 located in water depths of 7,160 to 7,259 feet. The approved plan of exploration represents only a blueprint for Shell’s development and each well to be drilled and sidetracked would still need its own permit.
As part of its plan of exploration, Shell disclosed its estimate for the worst case spill that could occur from the development. Shell calculated that if the “C” well, located in Mississippi Canyon Block 391, leaked, it could require 128 days to cap and might release 45 million barrels of oil during this time. Shell also disclosed that it has contracted with the Marine Well Containment Company (MWCC), a consortium of five very large oil companies operating in the Gulf of Mexico, including Shell. MWCC has constructed and successfully demonstrated an oil well spill containment system for use in the deepwater Gulf of Mexico. The system was upgraded, and approved by BOEMRE, for use in wells drilled in up to 10,000 feet of water. The system can contain a well with a daily flow rate of up to 60,000 barrels, which MWCC expects to increase to 100,000 barrels per day in the near future.
One of the problems for Shell is the impact of its worst case spill scenario. If one does the math – divides 45 million barrels of oil by 128 days – the well’s uncontrolled flow rate is somewhere around 350,000 barrels per day, well in excess of the existing containment system’s capability and even the proposed upgraded one. We believe it was this disclosure that helped trigger the lawsuits as the volume of oil that could be spilled is nearly tenfold greater than the volume of oil that leaked from BP’s (BP-NYSE) Macondo well.
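The arithmetic is worth laying out explicitly, since the mismatch drives the rest of the argument. The figures below come straight from the disclosures discussed above:

```python
# Back-of-the-envelope check of Shell's worst-case disclosure.
worst_case_barrels = 45_000_000   # total spill volume from the filing
days_to_cap = 128

flow_rate = worst_case_barrels / days_to_cap
print(f"Implied uncontrolled flow: {flow_rate:,.0f} barrels/day")
# Over 350,000 barrels/day -- far above the containment system's
# 60,000 bbl/day capacity, and even the planned 100,000 bbl/day upgrade.

macondo_barrels = 4_900_000       # disputed government estimate for Macondo
print(f"Multiple of the Macondo spill: {worst_case_barrels / macondo_barrels:.1f}x")
# About 9.2x, i.e. "nearly tenfold"
```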
Why could these lawsuits become a problem for the domestic deepwater offshore industry? The thrust of the complaints is based on the multi-sale Environmental Impact Statement (EIS) prepared for central Gulf lease sales that occurred under the 2007-2012 five-year Outer Continental Shelf (OCS) program. BP’s Macondo well was located on a block obtained from one of these lease sales, as are Shell’s blocks. The multi-sale EIS prepared for these sales stated that the worst case oil spill volume that could occur would be 26,500 barrels. There actually was a stipulated range of between 5,500 barrels and 26,500 barrels.
Now that the industry, and the government, has experienced the Macondo spill of 4.9 million barrels, even though that number is in dispute, there is proof that the EIS projections of the possible worst-case impact on marine mammals and the coastal region have grossly understated the potential damage. Therefore, the lawsuits contend, the Macondo spill represents “new circumstances or information relevant to environmental concerns” that triggers BOEMRE’s duty to prepare a supplemental multi-sale EIS and undertake its required consultations with the various agencies and environmental groups that form part of the updating process. The suits say BOEMRE should not do anything until then, so the granting of the drilling plan approvals is in violation of the laws.
This will be an interesting case to watch because if the government loses, a big question will be how aggressively it pursues appeals. For an anti-fossil fuel administration, having its “modified but accelerated” permit-granting process ruled deficient would not be the worst outcome. Its message to the offshore oil and gas industry would be: We tried to put a new plan in place that helped restart the offshore drilling business, but the courts made us prepare a supplemental multi-sale EIS, so we’re sorry we have to shut down deepwater drilling for the next year.
Last fall in a presentation at the National Ocean Industries Association meeting, David Lawrence, executive vice president of exploration for Shell, made the comment that the requirement for preparation of a supplemental multi-sale EIS would be the worst outcome for the industry. His assessment was based on the preparation time needed for a supplemental EIS. He estimated that the time needed for a new EIS could consume the better part of a year. Barely three weeks following the Association meeting, on November 10, 2010, BOEMRE issued notice of its intent to prepare a supplemental EIS for Western Planning Area Lease Sales 218 and 222, the two remaining Gulf of Mexico sales under the 2007-2012 five-year OCS program. That notice would seem to imply that BOEMRE knew it needed an updated EIS to hold future sales. Does that mean it also needs to update the prior EIS to approve new drilling permits on leases obtained under the old EIS? BOEMRE Director Michael Bromwich has said in various speeches and presentations that he is hopeful the agency’s work will survive legal scrutiny. So far the agency has a spotty court record if one considers its losses during the deepwater drilling moratorium. We may soon find out whether Director Bromwich’s “hopes” prove well-founded or are merely political wishes.
Green Energies Clash And Suffer Under Renewable Mandate (Top)
The West Coast, home of many green energy mandates, has discovered that too much of a renewable fuel can create problems for other renewables. Someone has to suffer, but “woe is me” is not a solution. The issue is what to do about an abundance of cheap hydroelectric power and a surplus of expensive wind power. The battle also involves federally-protected fish. Just how can a power company balance all these constituencies while not violating a law?
In recent weeks in the Pacific Northwest, the Bonneville Power Administration (BPA) was confronted with a temporary oversupply of hydroelectricity. This was due to the Army Corps of Engineers ordering increased river flows to maintain space in upstream reservoirs to allow for further runoff from the largest Northwest snowpack since 1997. The Corps action was dictated by its need to balance water levels and water quality against a federal requirement to protect salmon and steelhead fish.
Exhibit 6. Bonneville Power’s Territorial Scope

Rising water runoff pushed dissolved gas levels at most of the eight federal dams on the Lower Snake and Columbia Rivers above 120%, exceeding Washington and Oregon state water quality standards and threatening protected salmon and steelhead fish. Reducing hydroelectric generation in such a situation would send more water through the spillways and could push gas levels higher for longer periods, further endangering the fish. The Corps of Engineers, Bureau of Reclamation and BPA are required under a court order to manage spill levels to protect the endangered fish.
To deal with this requirement, BPA issued an advisory that it would temporarily limit the output of non-hydroelectric energy, including fossil fuel and other thermal power generation and wind energy. BPA initially limited all coal, natural gas and other thermal power generation to minimum levels required for electric grid stability and safety. It then limited approximately 250 to 300 megawatts of wind power generation, while still receiving wind power from about 3,000 megawatts of generation capacity. The estimated amount of wind power foregone was approximately 1,400 megawatt-hours. To put into perspective the significance of the wind power restriction, it is equal to about one-third of the generating capacity of the Bonneville Dam.
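A detail that often trips readers up here is the difference between capacity (megawatts) and energy (megawatt-hours). BPA’s two figures together imply a curtailment duration, assuming a steady cut at the midpoint of the quoted range. The duration is our inference for illustration, not a number BPA published:

```python
# Megawatts measure instantaneous generating capacity; megawatt-hours
# measure energy delivered over time. BPA's two figures imply a duration.
curtailed_mw = 280            # assumed midpoint of the 250-300 MW range cited
energy_foregone_mwh = 1_400   # BPA's estimate of wind energy foregone

implied_hours = energy_foregone_mwh / curtailed_mw
print(f"Implied curtailment duration: {implied_hours:.0f} hours")  # 5 hours
```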
On May 13th, BPA signed an “Environmental Redispatch” Record of Decision. The policy states that during times of high hydroelectric generation, lower electric demand and high wind power generation, BPA could curtail wind generation without any compensation to wind project owners. An official from the American Wind Energy Association says the lost wind generation from the broken wind contracts has cost the wind farm owners millions of dollars in income. Wind power output in the BPA territory has increased tenfold over the past decade from 600 to 6,000 megawatts. Much of this power is sold to utilities in California or elsewhere outside of the BPA territory.
The problem is that BPA needs to keep a certain amount of stable power generation – primarily from fossil fuel plants – in order to assure that its service and transmission outlets are not damaged by the variability of wind generation. The reality of operating a system, while not penalizing customers, seems to be lost on critics of BPA.
Representative Earl Blumenauer (D-OR) said, “The actions that the Bonneville Power Administration took are in direct conflict with the stated goals of the Department of Energy, the Obama Administration and many key energy policy leaders.” How about the requirement not to violate a court order, or not to harm its customers?
But then we have Pat Ford, executive director of Save Our Wild Salmon, who said that BPA’s actions and its statements were wrong. “In general, wind generation and salmon generation are complementary objectives, not competing objectives,” he said. The management of BPA is caught in an intractable situation: How to integrate the region’s growing supply of intermittent wind power without compromising the reliability of the grid or transferring large new costs onto BPA customers who don’t use the wind power?
Wind farm developers, however, believe they have a solution. BPA should be paying utilities outside the region to shut off their own fossil fuel powered plants and take the excess hydroelectric power rather than curtailing wind power. The problem is that this strategy would inflict additional costs on BPA customers, none of whom are using the wind power. As Brian Skeahan, general manager of Cowlitz Public Utility District in Lewis County in southwest Washington state puts it, “The cost allocation issues matter. We need some conversation about who pays.” This is especially true when energy sources driven by mandates and supported by subsidies clash and innocent power companies and customers are caught in the middle.
Is High Unemployment Destined As The New Normal? (Top)
The nation struggles with a current unemployment rate of 9.1% of its 153.7 million workers, which means there were 13.9 million unemployed workers in May. The unemployment rate increased from 9.0% in the prior month and from 8.8% in March. Compared to a year ago, the employment situation has improved from an unemployment rate of 9.6%, and there are one million fewer unemployed people. So while the nation’s labor market has improved over the past 12 months, it has deteriorated over the past 90 days, and various economic data series reflect further weakening of the economy, suggesting the labor market may not improve soon.
Based on the household survey data, the Bureau of Labor Statistics (BLS) reported that in May adult men experienced an unemployment rate of 8.9%, adult women had a rate of 8.0% and nearly one quarter of the nation’s teenagers were unemployed (24.2%). On a racial basis, American whites had an unemployment rate of 8.0%, while blacks were at 16.2% and Asians at 7.0%.
The problem with the labor market is the large and growing number of workers who are experiencing long-term unemployment. According to the BLS, in May those workers identified as long-term unemployed (out of work for 27 weeks or more) increased by 361,000 from April to 6.2 million. This group represents 45.1% of all those unemployed. Another highly negative category is the 8.5 million workers identified as part-time workers who either had their full-time work hours reduced involuntarily or could only find part-time employment despite desiring full-time work. Of these under-employed workers, 70% are victims of slack business conditions while 30% could only find part-time employment.
The bottom line is that the labor market is extremely weak. Data from the establishment survey showed that nonfarm payrolls increased by only 54,000 employees compared to an average monthly gain of 220,000 employees for the prior three months. The employment gains for both March and April were revised down. The March gain of 221,000 was lowered to an increase of only 194,000 employees, and the April gain was reduced by 12,000 jobs from the initially reported increase of 244,000.
In addition to the rise in the unemployment rate in May and the extremely low gain in new jobs added, average hourly earnings for all employees on private nonfarm payrolls went up by only six cents to $22.98 per hour, a gain of 0.3%. Over the past 12 months, average hourly earnings have increased by only 1.8%, which, after the 3.6% increase in consumer prices, means workers have seen their real hourly earnings decline by roughly 1.8%.
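The real-wage arithmetic can be checked directly. The simple difference between nominal wage growth and inflation gives the 1.8% decline; the exact compounded figure is slightly smaller:

```python
# Average hourly earnings figures from the May jobs report.
hourly = 22.98          # dollars per hour, all private nonfarm employees
monthly_gain = 0.06     # dollar change from the prior month

pct_gain = monthly_gain / (hourly - monthly_gain) * 100
print(f"Monthly earnings gain: {pct_gain:.1f}%")   # about 0.3%

nominal_yoy = 1.8       # 12-month nominal earnings growth, percent
cpi_yoy = 3.6           # 12-month consumer price inflation, percent

simple_real = nominal_yoy - cpi_yoy
exact_real = ((1 + nominal_yoy / 100) / (1 + cpi_yoy / 100) - 1) * 100
print(f"Real earnings change (simple): {simple_real:.1f}%")  # -1.8%
print(f"Real earnings change (exact):  {exact_real:.1f}%")   # about -1.7%
```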
Exhibit 7. Unemployment Time Is Lengthening For Many
Source: Agora Financial
As shown in the above chart, the average duration of unemployment rose at a modest rate from 1948 until the 2008 financial crisis that caused the 2009 recession. At that point, the average duration began to soar. Despite the ending of the recession nearly two years ago, unemployment remains high and average duration continues to lengthen. We began thinking about whether there were any other factors influencing the weak labor market and the rise in the number of workers on long-term unemployment. We came across a book: The Lights In The Tunnel – Automation, Accelerating Technology and the Economy of the Future by Martin Ford, a computer software developer who runs a high-tech business.
In the introduction to his book, Mr. Ford writes that he spends a lot of time thinking about computer technology and he began to focus on how technology and economics intertwine. He began to believe the impact of technology on economics may have contributed to the severity of the recent downturn and may be hurting the pace of the economic recovery. As he pointed out, computer people have been speculating for a long time on the likelihood that computers someday will approach and even surpass human beings in general capability and intelligence, and this will have a negative impact on employment.
According to Mr. Ford, at a 2007 computer industry conference, Larry Page, co-founder of Google (GOOG-Nasdaq), said, “We have some people at Google [who] are really trying to build artificial intelligence and to do it on a large scale. It’s not as far off as people think.” Ray Kurzweil, the well-known inventor, author and futurist, states categorically that he expects computers to become at least as intelligent as humans by the year 2029.
If computers and robots are going to become more capable and flexible, it is likely that the employment market will be the first to be impacted. The reason, according to Mr. Ford, is that computers and robots become more attractive as employees than humans, who bring all sorts of personal issues and benefit challenges to work. In his view, a substantial portion of the routine, specialized jobs held by average people, including college graduates, simply do not need the breadth of a human being.
Mr. Ford was surprised to find that economists aren’t contemplating the impact of artificial intelligence in their models of the future of the economy while technologists are. Technology could replace a large fraction of the human workforce, which could lead to large structural unemployment. As Mr. Ford points out, economists treat technological advancement as always leading to more prosperity and more jobs. A recent New York Times article focused on how businesses have increased their investment in capital assets (technology) rather than hire more workers. Historically, technology and the market economy have worked together to make us all wealthier, but will this always be true?
A free market economy cannot work without a viable labor market. Jobs are the primary mechanism through which income (purchasing power) is distributed to people who consume what the economy produces. Since technology appears to be accelerating, its impact on the labor force may come sooner than we expect or are prepared to deal with. This observation resonates in light of the increase in the number of workers experiencing long-term unemployment. Are they the early victims of technology’s damage to the functioning of our current labor market and ultimately our economic recovery?
Mr. Ford argues that the Luddite Fallacy is merely an historical observation, rather than a rule that applies indefinitely into the future. The Luddite Fallacy is the belief that employers will use machines to boost worker productivity but then will lay off workers to keep total output stagnant and to prevent prices from falling. This is in contrast to the neoclassical economic belief that automation increases productivity of the workforce, which leads to lower prices for goods and services. Lower prices increase consumer demand and in order to meet the rising demand, employers hire more workers.
Mr. Ford suggests that as information technology and artificial intelligence advance, robots and other forms of automation will ultimately result in significant unemployment as machines and software begin to match and exceed the capability of workers to perform most routine jobs. As robotics and artificial intelligence develop further, even many skilled jobs may be threatened. Technologies such as machine learning may ultimately allow computers to do many knowledge-based jobs that require significant education. This may result in substantial unemployment at all skill levels, stagnant or falling wages for most workers, and increased concentration of income and wealth as employers capture an ever larger portion of the economy’s rewards. This in turn could depress consumer spending and economic growth: most of the population either holds a job or depends on someone who does, so as those jobs disappear, people will no longer have sufficient income to purchase the products and services the economy produces.
There are several assumptions made by neoclassical economists about technology. These include the idea that machines are little more than tools used by workers and that they increase the worker’s productivity. Also, the vast majority of workers in the population are capable of becoming machine operators. Mr. Ford asks: What happens if these assumptions prove wrong?
He believes that advanced machine automation will come to low wage countries as well as developed economies. Mr. Ford pointed to a 2003 article in AutomationWorld reporting that automation was causing significant job losses in Brazil, India and China. In addition, the article went on to say, income inequality worldwide was becoming a greater issue due to the impact of labor automation, and this issue would only get worse with accelerating automation technology.
In the extreme case Mr. Ford develops in his book, the U.S. could experience upwards of a 75% unemployment rate by 2030, which would cripple our nation since there would be no money to purchase our economy’s output. As a result, Mr. Ford suggests a massive restructuring of the economy and society to level incomes and reduce the huge gap between rich and poor. He would establish a tax structure that would essentially recoup the proportion of companies’ income previously represented by wage costs, and he would have the government redistribute those funds to the unemployed so they would have significantly higher incomes than they currently receive from unemployment payments. He believes business owners would welcome that restructuring because they need customers, and they would still increase their profitability relative to today’s cost structure by shedding employee benefits and other employment-related costs that are not paid out in the form of wages.
In Mr. Ford’s new world, businesses would become more profitable with significantly fewer workers, while the huge numbers of unemployed would enjoy healthy incomes and satisfying lives, having the money and time to engage in more socially-satisfying and educationally-enjoyable endeavors. We doubt this fantasy world will ever evolve, but merely reading Mr. Ford’s book is provocative for considering the economic and social ramifications of economies run by robots and artificially intelligent machines. It also makes one think about the impact this fantasy world would have on future energy demand, and how we would produce and distribute the energy that would still be needed to support economic activity.
People Starting To Connect Dots Of Renewable Energy Cost (Top)
The Deepwater Wind demonstration energy project to be located off the coast of Block Island in the coastal waters of Rhode Island is moving along slowly. The slow pace is dictated by Deepwater Wind’s wait for the state Supreme Court’s ruling on the validity of the electricity purchase contract for the wind farm’s output. Since the power contract was approved last fall, the project has been dealt several body blows. While visual pollution remains an aggravating problem for the residents of Block Island, they are now beginning to see what their electricity costs will be and how much profit Deepwater Wind will make from the project. We recently learned that Deepwater Wind plans to use taller wind turbines than originally designed, so we believe that on a clear day we may see them from our house on the Rhode Island shore. We already know we will be paying for a portion of the additional cost of their output.
A consulting firm was hired several years ago by Deepwater Wind to prepare maps of Block Island showing those areas of the island from which some portion of the wind turbines would be visible, both during the summer when trees and bushes are in full leaf and during the winter without the foliage. The maps were prepared from satellite data, so they don’t completely show what people will actually encounter – either positively or negatively. At the time the maps were prepared for a presentation by Deepwater Wind to the Rhode Island Historical Preservation and Heritage Commission two and a half years ago, the plan was to install 3.6-megawatt (MW) wind turbines standing 436 feet tall. Now the company has decided to use either 5-MW or 6-MW turbines that will stand 50 feet taller than the original ones, or 486 feet tall.
Exhibit 8. Islanders’ Summer Views Of Turbines
Source: Saratoga Associates
The first map shows in red those areas that will see some part of the turbines during the summer. Those areas were determined by assuming a solid 15-foot wall of greenery everywhere on the island, an assumption quite different from the variable landscape of trees and bushes that actually exists. As one Block Island resident pointed out, many of the Japanese Black Pine and native Pitch Pine trees on the island are dying due to the black turpentine beetle that has infiltrated the forests. That means the existing foliage barrier will shrink in the future.
Exhibit 9. Islanders’ Winter Views Of Turbines
Source: Saratoga Associates
During winter months when there is no foliage, much more of the island will be able to see the wind turbines, especially now that they will be taller. Notably, the tallest point on Block Island is Beacon Hill at 211 feet. The Southeast Light, a national historic landmark and the highest positioned lighthouse in New England, sits on Mohegan Bluff, 200 feet above the water. The light tower is 261 feet tall, making its tip 461 feet above sea level, which will be dwarfed by the new wind turbines.
Exhibit 10. Southeast Lighthouse To Be Dwarfed
Source: Photo Cara Call, Lighthouse.cc
The first real setback to the Deepwater Wind project was the Public Utility Commission’s (PUC) rejection of the first Power Purchase Agreement (PPA) between the company and National Grid (NGG-NYSE). That development sent the governor and the legislature scrambling to amend the state’s renewable power law to assure that the PPA, once slightly modified and resubmitted, would be approved by the PUC. It was approved, but that led to a court challenge over the terms of the PPA and the process by which it was approved. It is on these challenges that Deepwater Wind awaits the Supreme Court’s ruling. Financially, the project’s ability to raise the necessary funds to construct the turbines was hit by the announcement that its loan guarantee application was placed on hold because funding for the federal program was cut in the December budget agreement between Congress and the Obama administration.
The CEO of Deepwater Wind earlier had testified before the PUC that “a loan guarantee is practically required in order to attract the requisite investment for the project to move forward.” He later said that the project could still attract financing without the guarantee. Recently, Deepwater Wind’s Chief Development Officer Paul Rich told the Electricity Utility Task Force on Block Island, “Whether that [the absence of a federal loan guarantee] will make or break the project is going to be continued to be reviewed.”
The latest blow came when the project’s SeaZephIR spar buoy, which carried a wind measuring system and had been briefly deployed off the island’s west side, suffered a structural failure. Fortunately, the hard drive and back-up storage systems were recovered and proved the system functioned properly in collecting wind data. The buoy is undergoing a forensic analysis to determine why it failed and how to correct it. Deepwater Wind still needs to collect wind data in order to finish the design of the project and the placement of the wind turbines.
While all of this has been transpiring, we have seen multiple letters to the editors of Rhode Island newspapers tying together a number of dollars-and-cents issues about the wind project to show how much it will really cost the public. Mr. Rich told Block Island residents that Deepwater Wind stands to enjoy $230 million in project-level free cash flow. This comes as National Grid has updated its estimate that ratepayers will pay $415 million in excess of market costs for the privilege of buying power from Deepwater Wind’s wind turbines. And Rhode Island economic development officials testified before the PUC that the total economic benefit to the state from the wind project will be $129 million. Moreover, as PUC member Paul Roberti has stated, the Deepwater Wind project will create only six permanent jobs, certainly negating the “green jobs” claims of the state’s renewable energy law.
In the end, the “economic detriment” for Rhode Island ratepayers totals $286 million ($415 million in above-market power costs less the $129 million in estimated economic benefits), of which, under the PPA contract, $230 million will be transferred directly to Deepwater Wind as project profits. With the nation’s third highest unemployment rate (10.9% in April), stagnant wages, Rhode Island cities going into receivership and more on the brink of bankruptcy, and home prices continuing to fall, the public is grasping that offshore wind is too expensive a source of electricity.
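For readers who want to trace the arithmetic, the dollar figures quoted in the letters and testimony reconcile as follows (a simple sketch using only the publicly cited estimates; the variable names are ours):

```python
# Reconciliation of the Deepwater Wind figures cited above.
# All amounts are in millions of dollars and come from the public record:
# National Grid's ratepayer estimate, state officials' PUC testimony,
# and Mr. Rich's statement to Block Island residents.
excess_ratepayer_cost = 415     # above-market cost borne by ratepayers
state_economic_benefit = 129    # total benefit estimated by RI officials
developer_free_cash_flow = 230  # project-level free cash flow to Deepwater Wind

net_detriment = excess_ratepayer_cost - state_economic_benefit
print(f"Net economic detriment to ratepayers: ${net_detriment} million")
print(f"Portion flowing to Deepwater Wind as profit: ${developer_free_cash_flow} million")
```

The net detriment works out to $286 million, with roughly 80% of it captured by the developer as profit under the PPA.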
U.S. Hurricane Forecast Explains How 2011 Could Be Worse (Top)
We recently participated in a webinar conducted by ImpactWeather, a Houston-based subsidiary of Universal Weather and Aviation that provides weather forecasts to corporations and organizations designed to assist them in meeting their safety, security, operational and risk management responsibilities. The webinar focused on ImpactWeather’s 2011 hurricane forecast, which is not terribly different from the forecasts issued by most of the other long-standing public and private forecasters. ImpactWeather is calling for 14 named storms, eight hurricanes and four major hurricanes this storm season. The forecast is above the normal number of storms and hurricanes experienced over 1950-2010; it is slightly below the 1995-2010 average for named storms (14 vs. 15) but exactly in line with that period’s average number of hurricanes and major hurricanes.
What we found interesting and different about this presentation was that Chris Hebert, the manager of ImpactWeather’s Tropics Watch service, showed how their forecast for storms differs from last year’s activity. While the webinar focused on the next 30 days, Mr. Hebert was able to put this period into perspective. Historically, 65% of named tropical storms develop after August 31st, based on an analysis of the frequency of storms over the last century. Interestingly, there is usually a spike in storm activity during June, but it is principally related to tropical depressions and disturbances. While there was a disturbance active in the Caribbean at the time of the webinar, Mr. Hebert suggested it would probably amount to nothing, which proved to be true.
As many know, last year was a very active storm year with 21 named storms, 12 hurricanes and five major hurricanes. That compares with a normal year during 1950-2010 of 11 named storms, six hurricanes and two major hurricanes. One of the most significant developments last year was that no hurricanes made U.S. landfall; only Tropical Storm Bonnie came ashore on the upper Texas Gulf Coast. The U.S. coastline will probably not be as fortunate this year.
The normal pattern for storms is shown in the following two charts from AccuWeather.com. The first chart (Exhibit 11) shows the typical pattern for tropical storms that are generated over the warm waters off the African coast and move across the Atlantic basin into the warm waters of the Caribbean Sea and the coastal waters of Central America, Mexico and the United States.
Exhibit 11. How Hurricanes Form
As the second chart (Exhibit 12) demonstrates, there is a normal travel pattern these storms follow as they move westward from the African coast. Part of the pattern is shaped by the location and strength of the Bermuda high pressure mass that normally sits in the middle of the Atlantic Ocean. The stronger the high pressure mass, the larger it tends to be and the greater its ability to push storms further westward before they begin turning northward. Exactly when the storms turn north is influenced by the location of high and low pressure centers over the eastern half of the United States.
Exhibit 12. How Hurricanes Reach The U.S.
What we saw last year was that the Bermuda High was weaker than normal and located further to the northeast of its usual position. This location, coupled with a low pressure mass centered over the Northeast that extended down the East Coast and a high pressure center positioned along the Gulf Coast, tended to push storms northward sooner and thus kept most of them away from the U.S. coastline.
Exhibit 13. History Of 2010 Hurricane Season
Source: Weather Underground
In laying out the case for an active 2011 storm season with increased risk to the U.S. coastline, Mr. Hebert pointed out that last year’s hurricane season was dominated by a La Niña in the Pacific Ocean that helped tropical storms form and strengthen. Since last year, La Niña has weakened and has almost totally disappeared, so this year will find the Pacific Ocean in a neutral zone between El Niño and La Niña. Mr. Hebert also showed a chart of the annual variation of Atlantic basin sea surface temperatures from their normal level. This variation normally runs in 25-40 year cycles; the current warming cycle began in 1995, so we could be looking at another 10-25 years of warmer sea surface temperatures. At present, temperatures are above normal but cooler than last year, which could marginally reduce the number of storms forming this year.
ImpactWeather’s key consideration in their forecast is that the Bermuda high will be located in its normal position, meaning it will be further south and west from last year’s position and that it will be stronger than last year. Additionally, the trade winds should be stronger in the deep tropics of the eastern portion of the Caribbean. Stronger trade winds mean increased low-level wind shear compared to last year that could break up storms or limit their strengthening. Because Mr. Hebert foresees the high pressure center that was parked on the Gulf Coast last year being positioned further north, and the eastern low pressure center located more westerly, he sees increased risk for storms coming into the Gulf of Mexico and targeting the Louisiana/Texas coasts. There is also the possibility storms could turn northward earlier and impact the Florida and Mid-Atlantic coastline.
Exhibit 14. The Likely Course For Hurricanes This Season
Despite ImpactWeather’s storm forecast being mainstream, their webinar did an excellent job of explaining why the energy business and Gulf Coast residents need to be more alert than last year. We will continue to monitor the development of the weather ingredients that influence the number, intensity and trajectory of tropical storms this summer. Readers should note that the names to be used for storms this year are based on the recycled 2005 list with the exception of five names that have been retired. The most famous of those five retired names are Katrina and Rita, at least for Gulf Coast residents. Those names have been replaced by Katia and Rina. The other three retired names and their replacements are: Dennis/Don; Stan/Sean; and Wilma/Whitney. We wonder what name, if any, will be the next one to enter weather history.
1900 St. James Place, Suite 125
Houston, Texas 77056
Main Tel: (713) 621-8100
Main Fax: (713) 621-8166
Parks Paton Hoepfl & Brown is an independent investment banking firm providing financial advisory services, including merger and acquisition and capital raising assistance, exclusively to clients in the energy service industry.