Let's talk a little about CHEMTRAILS and other climate/weather-modification technology research that has indeed been under way these past few decades. Even scientists are shouting that all this can have devastating consequences, and it does involve releasing more and more chemicals that will harm public health. The reality of an approach that necessitates global cooperation sustained for hundreds or thousands of years is crazy when all we have to do is reverse global markets and stop building International Economic Zones with global corporate campuses and global factories.
'HAARP technology for weather modification, in conjunction with chemtrails, use the ionosphere and particulate matter in the stratosphere (chemtrails) to reflect energy back and forth in order to control jet streams, which in turn cause the desired weather patterns'.
I wanted to share the article below to show the wide interpretation of what all this science means or has as a goal. It is from a right-leaning citizen who, as I posted, still thinks global warming is a conspiracy theory. She is right in saying the ploy of creating small businesses to MOVE FORWARD all this environmental policy aimed at GLOBAL CORPORATE SUSTAINABILITY FOR THE 1% is bad economics. When we allow the far-right Wall Street global corporate neo-liberals to write green policy PRETENDING TO BE LEFT-LEANING----everyone is going to hate environmental policy.
THIS IS NOT LEFT-LEANING ENVIRONMENTAL POLICY.
This article covers many of the technology concerns and identifies chemicals used in many of these research projects----and the goals below do often match real goals. There are also many descriptions that are not real, and this is what makes receiving information in the US so difficult....we no longer have a public-interest government with public-interest public universities creating public-interest research data----data now is released to hide dangers to the public interest so that products benefiting patents or corporate power and profit keep MOVING FORWARD.
The danger in this article is a citizen not taking this issue of global warming SERIOUSLY----still against corporate taxes for environmental issues----while focusing on science that often is not true. The concern over ONE WORLD control by a global corporate tribunal is real---they do have only the 1%'s interests as a goal. We cannot work as the 99% against the 1% when these differing views on public policy create factions.
THIS IS NOT A CORPORATE TAX ON BREATHING----IT IS CORPORATE TAXATION TO FUND MITIGATION OF ENVIRONMENTAL DAMAGE AND BUILD REAL PUBLIC-INTEREST SUSTAINABILITY. THE ESTATE TAX IS NOT A TAX ON DYING----IT IS A TAX TO KEEP AMERICA FROM EXTREME-WEALTH AND EXTREME-POWER COLONIZATION BY A FEW ABLE TO ACCUMULATE TOO MUCH MONEY...LIKE NOW.
In Maryland----the RAIN TAX was indeed bogus progressive posing meant simply to soak Maryland and Baltimore citizens for more tax revenue. It was written to allow the worst violators of surface impermeability to go without paying this tax. The far-right writes tax policy to be regressive----while real environmental taxation is aimed at corporations, not small businesses and individuals.
'The global warming scam
So what to do? Well, like any good entrepreneur, you can take advantage of disaster capitalism to make a few bucks. Want to make a bit of money by creating a global warming scam? Spray a few chemtrails, melt a bit of ice, take a picture of a polar bear surrounded by water, and WALLA! a global warming crisis that can be used to manipulate the public into accepting a carbon tax for breathing'.
What do HAARP, Chemtrails, and Global Warming all have in common?
By Barbara H. Peterson
Farm Wars
Chemtrails are bad news. They contain barium and aluminum, which have the potential to destroy ecosystems around the world. But what if this destruction is merely an inevitable and acceptable consequence of a much larger program?
A program intended to be part of an all-in-one solution to global control and manipulation?
HAARP
A link to the patent (4,686,605) at the patent office website is here.
What is the purpose of HAARP? To understand this, we can look to information in the patent:
This invention has a phenomenal variety of possible ramifications and potential future developments. As alluded to earlier, missile or aircraft destruction, deflection, or confusion could result, particularly when relativistic particles are employed. Also, large regions of the atmosphere could be lifted to an unexpectedly high altitude so that missiles encounter unexpected and unplanned drag forces with resultant destruction or deflection of same.
Weather modification is possible by, for example, altering upper atmosphere wind patterns or altering solar absorption patterns by constructing one or more plumes of atmospheric particles, which will act as a lens or focusing device.
Also as alluded to earlier, molecular modifications of the atmosphere can take place so that positive environmental effects can be achieved. Besides actually changing the molecular composition of an atmospheric region, a particular molecule or molecules can be chosen for increased presence. For example, ozone, nitrogen, etc. concentrations in the atmosphere could be artificially increased. Similarly, environmental enhancement could be achieved by causing the breakup of various chemical entities such as carbon dioxide, carbon monoxide, nitrous oxides, and the like.
Transportation of entities can also be realized when advantage is taken of the drag effects caused by regions of the atmosphere moving up along diverging field lines. Small micron sized particles can be then transported, and, under certain circumstances and with the availability of sufficient energy, larger particles or objects could be similarly affected. Particles with desired characteristics such as tackiness, reflectivity, absorptivity, etc., can be transported for specific purposes or effects. For example, a plume of tacky particles could be established to increase the drag on a missile or satellite passing therethrough. Even plumes of plasma having substantially less charged particle density than described above will produce drag effects on missiles which will affect a lightweight (dummy) missile in a manner substantially different than a heavy (live) missile and this effect can be used to distinguish between the two types of missiles.
A moving plume could also serve as a means for supplying a space station or for focusing vast amount of sunlight on selected portions of the earth.
Surveys of global scope could also be realized because the earth’s natural magnetic field could be significantly altered in a controlled manner by plasma beta effects resulting in, for example, improved magnetotelluric surveys.
Electromagnetic pulse defenses are also possible. The earth’s magnetic field could be decreased or disrupted at appropriate altitudes to modify or eliminate the magnetic field in high Compton electron generation (e.g., from high altitude nuclear bursts) regions. High intensity, well controlled electrical fields can be provided in selected locations for various purposes. For example, the plasma sheath surrounding a missile or satellite could be used as a trigger for activating such a high intensity field to destroy the missile or satellite.
Further, irregularities can be created in the ionosphere which will interfere with the normal operation of various types of radar, e.g., synthetic aperture radar.
The present invention can also be used to create artificial belts of trapped particles which in turn can be studied to determine the stability of such particles.
Still further, plumes in accordance with the present invention can be formed to simulate and/or perform the same functions as performed by the detonation of a “heave” type nuclear device without actually having to detonate such a device. Thus it can be seen that the ramifications are numerous, far-reaching, and exceedingly varied in usefulness.
In a nutshell, we can see that HAARP is intended to do the following:
- Destroy, deflect, or confuse missiles or aircraft
- Modify the weather
- Change the molecular composition of an atmospheric region, by increasing or decreasing ozone, nitrogen, carbon dioxide, carbon monoxide, and/or nitrous oxides
- Increase the drag on a missile or satellite
- Focus vast amounts of sunlight on selected portions of the earth.
- Alter the earth’s magnetic field for things such as surveying
- Create high intensity, well controlled electrical fields in selected locations for various purposes
- Create irregularities in the ionosphere, which will interfere with the normal operation of various types of radar, e.g., synthetic aperture radar
- Create plumes to simulate and/or perform the same functions as performed by the detonation of a “heave” type nuclear device without actually having to detonate such a device.
U.S. Patent #4,042,196 (1977)
There is disclosed method and apparatus for triggering a substantial change in ionospheric characteristics of the earth and measuring certain selected characteristics of the earth. Substantial energetic particle precipitation is triggered through injection of low energy ionized gas, such as hydrogen, in the region of large fluxes of energetic particles in or near the magnetic equator. The loss process is known to occur naturally but a triggered change is achieved through injection of larger amounts of low-energy ionized gas than are naturally present, preferably in the cusp region, which usually extends inside the synchronous orbit for several hours about local midnight.
With this information in hand, the inventor of HAARP was ready to reveal technology that contains one of the most dastardly plans for global manipulation possible – selective weather modification.
What do chemtrails have to do with it?
According to Dr. Michael Castle,
Chemtrails over Farm Wars property
HAARP is utilized for many clandestine missions, of which weather modification is a fundamental objective. Microwave, Extreme Low Frequency (ELF), Very Low Frequency (VLF) and other EMR/EMF-based systems are transmitted into the atmosphere and reflected by the ionosphere back through the Earth’s Stratosphere/Atmosphere where various airborne chemical particulates, polymer filaments and other electromagnetic frequency absorbers and reflectors are used to push or pull the prevailing jet-streams to alter weather patterns. In many instances, drought inducement technologies have been found in patented systems. Drought inducement occurs, according to reviewed technologies, by heating the stratosphere with microwaves, placing airborne chemical particulates in the airspace and thereby changing the base-line moisture gradients via microwaves from HAARP and desiccating regions chemically with barium titanates, methyl aluminum and potassium mixtures.
In a nutshell:
Jets spray chemtrails consisting of barium titanates, methyl aluminum and potassium mixtures in the stratosphere. These chemtrails are used to reflect and absorb radio and electromagnetic frequencies that are induced by HAARP, and reflected back from the ionosphere. This activity is used to push or pull jet streams, which alter weather patterns. This manipulation can result in all sorts of different weather patterns, including storms, drought, extreme cold, heat, etc.
In the late 1950’s, it was discovered that naturally occurring belts exist at high altitudes above the earth’s surface, and it is now established that these belts result from charged electrons and ions becoming trapped along the magnetic lines of force (field lines) of the earth’s essentially dipole magnetic field. The trapped electrons and ions are confined along the field lines between two magnetic mirrors, which exist at spaced apart points along those field lines. The trapped electrons and ions move in helical paths around their particular field lines and “bounce” back and forth between the magnetic mirrors. These trapped electrons and ions can oscillate along the field lines for long periods of time. (HAARP patent)
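For readers who want the textbook physics behind those "magnetic mirrors," here is the standard trapping condition from plasma physics----a generic result about the Earth's radiation belts, not anything specific to HAARP. A particle's magnetic moment is conserved as it moves along a field line, so it bounces back instead of escaping only if its pitch angle at the weak-field point keeps it outside the "loss cone":

```latex
% Standard magnetic-mirror trapping condition (textbook plasma physics).
% B_0     : field strength at the equatorial (weakest-field) point
% B_m     : field strength at the mirror point
% alpha_0 : pitch angle of the particle at the equatorial point
\mu = \frac{m v_\perp^2}{2B} = \text{const}
\qquad\Longrightarrow\qquad
\text{trapped iff } \sin^2\alpha_0 \;\ge\; \frac{B_0}{B_m}
```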
HAARP technology for weather modification, in conjunction with chemtrails, use the ionosphere and particulate matter in the stratosphere (chemtrails) to reflect energy back and forth in order to control jet streams, which in turn cause the desired weather patterns.
The global warming scam
So what to do? Well, like any good entrepreneur, you can take advantage of disaster capitalism to make a few bucks. Want to make a bit of money by creating a global warming scam? Spray a few chemtrails, melt a bit of ice, take a picture of a polar bear surrounded by water, and WALLA! a global warming crisis that can be used to manipulate the public into accepting a carbon tax for breathing. Or how about creating storms that wipe out cities that can be rebuilt in a more pleasing manner for the rich and powerful? You could wipe out local farming communities by creating drought in certain areas, declare a particular fish endangered, then take the farmers’ water in the name of ecology, and allow them to go bankrupt. The land could then be used for a more suitable purpose such as a wildlife preserve so that the rich and powerful can have a playground.
The possibilities are endless, and the cost? Well, the American people can pay for it with their tax dollars. The ramifications? What are a few billion deaths from chemical poisoning, starvation, malnutrition, and disease? Why, a much needed benefit, of course. After all, according to the Georgia Guidestones, the population needs to be reduced to 500 million in order to maintain the earth in the manner that the rich and powerful New World Order mafia would like it to be.
(C) 2010 Barbara H. Peterson
____________________________________________
Above we see a citizen identifying HAARP as a problem in changing global weather patterns, and as we see below, the last decade had plenty of citizens, including scientists, shouting that HAARP as a military project was bad----these two issues are now being brought together with the reality that both are bad for global climate, public health, and defense policy.
We see as well where all that energy for SUSTAINABILITY is going----these massive computer/data mega-centers require tons of energy to operate---tons of energy to keep equipment cool----and there will be no SMART METERS telling them to reduce their energy usage or pay soaring rates.
Below we see the concerns from a military point of view. If one is concerned about a 1% Wall Street ONE WORLD domination, then having these technologies tied to control of national airwaves is authoritarian, and that is why many citizens are protesting these defense systems. This is behind the consolidation of all low-frequency airwaves now happening under global corporation control----meaning the 99% of global citizens will have no control over these frequencies that carry the power of local and national communication.
IT TIES MORE AND MORE OF THAT LIMITED LOWER-FREQUENCY SPECTRUM TO THE 1% WALL STREET.
So, how does this tie to climate change and environmentalism?
CHEMTRAILS HAARP BLUEBEAM
Uploaded on May 11, 2009
The United States has three ionospheric heating facilities: the HAARP; the HIPAS, near Fairbanks, Alaska; and one at the Arecibo Observatory in Puerto Rico (currently offline for modifications). The European Incoherent Scatter Scientific Association (EISCAT) operates an ionospheric heating facility, capable of transmitting over 1 GW [5] (1,000,000,000 watts) effective radiated power (ERP), near Tromsø in Norway. Russia has the Sura ionospheric heating facility, in Vasilsursk near Nizhniy Novgorod, capable of transmitting 190 MW ERP.
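A note on those wattage figures: "effective radiated power" is not the power drawn from the grid. ERP is transmitter power multiplied by the antenna array's gain, so a tightly focused array posts enormous ERP numbers from far smaller inputs. A minimal sketch of the standard conversion----the 1 MW and 30 dB figures below are illustrative assumptions, not the published specs of HAARP, EISCAT, or Sura:

```python
# ERP = transmitter power x antenna gain (gain is usually quoted in decibels).
# Illustrative numbers only -- not the actual specs of any facility named above.
def erp_watts(tx_power_w: float, gain_db: float) -> float:
    """Effective radiated power from transmitter power and antenna gain."""
    return tx_power_w * 10 ** (gain_db / 10)

print(erp_watts(1_000_000, 30))  # 1 MW into a 30 dB array -> 1e9 W (1 GW ERP)
```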
Another site, operated by a military sub-contractor under an unknown arrangement between the US and Canadian governments, is located near Cape Race, Newfoundland, Canada, at N46° 38.649' W53° 9.010'. There is minimal or no grid power available at this site, so this may be a passive listening post for the transmissions emitted by other HAARP sites.
In August 2002, further support for those critical of HAARP technology came from the State Duma (parliament) of Russia. The Duma published a critical report on the HAARP written by the international affairs and defense committees, signed by 90 deputies and presented to then President Vladimir Putin. The report claimed that "the U.S. is creating new integral geophysical weapons that may influence the near-Earth medium with high-frequency radio waves ... The significance of this qualitative leap could be compared to the transition from cold steel to firearms, or from conventional weapons to nuclear weapons. This new type of weapons differs from previous types in that the near-Earth medium becomes at once an object of direct influence and its component." However, given the timing of the Russian intervention, it is possible that it was related to a controversy at the time concerning the US withdrawal in June 2002 from the Russian-American Anti-Ballistic Missile Treaty. This high level concern is paralleled in the April 1997 statement by the U.S. Secretary of Defense over the power of such electromagnetic weaponry. Russia owns and operates an ionospheric heater system as powerful as the HAARP[6], called 'Sura,' which is located roughly 150 km from the city of Nizhny Novgorod.
The objectives of the HAARP project became the subject of controversy in the mid-1990s, following claims that the antennas could be used as a weapon. A small group of American physicists aired complaints in scientific journals such as Physics and Society[5], charging that the HAARP could be seeking ways to destroy or disable enemy spacecraft or disrupt communications over large portions of the planet. The physicist critics of the HAARP have had little complaint about the project's current stage, but have expressed fears that it could in the future be expanded into an experimental weapon, especially given that its funding comes from the Office of Naval Research and the Air Force Research Laboratory.
These concerns were amplified by Bernard Eastlund, a physicist who developed some of the concepts behind the HAARP in the 1980s and proposed using high-frequency radio waves to beam large amounts of power into the ionosphere, energizing its electrons and ions in order to disable incoming missiles and knock out enemy satellite communications. The US military became interested in the idea as an alternative to the laser-based Strategic Defense Initiative. However, Eastlund's ideas were eventually dropped as SDI itself mutated into the more limited National Missile Defense of today. The contractors selected to build HAARP have denied that any of Eastlund's patents were used in the development of the project.
After the physicists raised early concerns, the controversy was stoked by local activism. In September 1995, a book entitled Angels Don't Play This HAARP: Advances in Tesla Technology by the former teacher Nick Begich, Jr., son of the late Congressman Nick Begich (D-AK) and brother of U.S. Senator Mark Begich (D-AK), claimed that the project in its present stage could be used for "geophysical warfare."
___________________________
The confusion for most Americans is this combined military-and-climate effort with the cross-purpose of airborne chemical/biochemical release. The statement above addresses the huge uptick in jet airtime as these projects cover entire regions of the nation. That is lots of jet fuel---and lots of carbon/methane release that does increase global warming. We see how our NOAA is outsourced to global corporations, so the data coming from NOAA will not be public-interest data---and all the Federal funding going to GREEN TECHNOLOGY----especially battery and cable----is going to these projects having nothing to do with creating REAL environmental, local energy sources.
'Raytheon also reports the weather for NOAA through its Advanced Weather Information Processing System. According to researcher Brendan Bombaci of Durango, Colorado, these Raytheon computers are directly linked with their UAV weather modification drones. Bombaci reports that NOAA paid Raytheon more than $300 million for this "currently active, 10-year project."'
This article addresses the science behind the public health issues from the chemicals released. There are projects like this in place where manipulation of global weather patterns is being tested. This is where many scientists are shouting that some of the technologies involved can easily go bad----creating weather patterns that worsen our global warming situation rather than helping it. Now, what is just as bad for WE THE PEOPLE is this: Federal funding sold as research for environmental solutions AND Federal funding sold as research to advance biomedicine are often being sent to build these systems.
SMART CITIES -----SMART WEATHER------NOT SO SMART
At a time when Congress is eliminating all social benefit programs---they have NO-LIMIT FUNDING FOR WHAT ARE VERY MARGINAL INNOVATIVE IDEAS. We are losing trillions of dollars on research in these areas. Think of the annual costs of maintaining these kinds of systems even if they do perform some service----it's the same as the cost of global cyber-security-----when simply reversing global policies and International Economic Zone policies would reverse and slow all these climate and DEFENSE problems.
NANO CHEMTRAILS
by William Thomas
If you did not enjoy "traditional" chemtrails raining down on you, you are not going to like the new version, which the United States Air Force promises will feature aerial dumps of programmable "smart" molecules tens of thousands of times smaller than the particles already landing people in emergency rooms with respiratory, heart and gastrointestinal complaints.
Under development since 1995, the military's goal is to install microprocessors incorporating gigaflops computer capability into "smart particles" the size of a single molecule.
Invisible except under the magnification of powerful microscopes, these nano-size radio-controlled chips are now being made out of mono-atomic gold particles. Networked together on the ground or assembling in the air, thousands of sensors will link into a single supercomputer no larger than a grain of sand.
Brought to you by the same military-corporate-banking complex that runs America's permanent wars, Raytheon Corp is already profiting from new weather warfare technologies. The world's fourth largest military weapons maker bought E-Systems in 1995, just one year after that military contractor bought APTI, holder of Bernard Eastlund's HAARP patents.
Raytheon also owns General Dynamics, the world's leading manufacturer of military Unmanned Aerial Vehicles.
Raytheon also reports the weather for NOAA through its Advanced Weather Information Processing System. According to researcher Brendan Bombaci of Durango, Colorado, these Raytheon computers are directly linked with their UAV weather modification drones. Bombaci reports that NOAA paid Raytheon more than $300 million for this "currently active, 10-year project."
She goes on to describe the Joint Environmental Toolkit used by the U.S. Air Force in its Weather Weapons System. Just the thing for planet tinkerers.
GREEN LIGHT
For public consumption, nano-weather control jargon has been sanitized. "Microelectric Mechanical Sensors" (MMS) and "Global Environmental Mechanical Sensors" sound passively benign. But these ultra-tiny autonomous aerial vehicles are neither M&Ms nor gems. [Space.com Oct 31/05]
According to a U.S. military flier called Military Progress, "The green light has been given" to disperse swarms of wirelessly-networked nano-bots into the troposphere by remotely-controlled UAV drones for "global warming mitigation."
U.S. Army Tactical Unmanned Aerial Vehicles, as well as U.S. Air Force drones "are slated to deploy various payloads for weather warfare," Military Progress asserts. This dual mission - to slow global warming and use weather as a weapon - seems somewhat contradictory.
FIGHTING FOR CLIMATE CHANGE
U.S. Military Inc. is already in the climate change business big time. The single biggest burner of petroleum on this planet, its high-flying aircraft routinely rend Earth's protective radiation shielding with nitrous oxide emissions, while depositing megatons of additional carbon, sulfur and water particles directly into the stratosphere - where they will do three-times more damage than CO2 alone.
Go figure. A single F-15 burns around 1,580 gallons an hour. An Apache gunship gets about one-half mile to the gallon. The 1,838 Abrams tanks in Iraq achieve five gallons to the mile, while firing dusty radioactive shells that will continue destroying human DNA until our sun goes supernova.
A single non-nuclear carrier steaming in support burns 5,600 gallons of bunker fuel in an hour - or two million gallons of bunker oil every 14 days. Every four days, each carrier at sea takes on another half-million gallons of fuel to supply its jets.
The U.S. Air Force consumed nearly half of the Department of Defense's entire fuel supply in 2006, burning 2.6 billion gallons of jet fuel aloft.
While flying two to five-hour chemtrails missions to reflect incoming sunlight and slow global warming, a single KC-10 tanker will burn 2,050 gallons of highly toxic jet fuel every hour. The larger and older KC-135 Stratotanker carries 31,275 gallons of chemtrails and burns 2,650 gallons of fuel per hour.
The EPA says that each gallon of gasoline produces 19.4 pounds of CO2. Each gallon of diesel produces 22.2 pounds of CO2.
Total it up and routine operations by a military bigger than all other world militaries combined puts more than 48 billion tons of carbon dioxide into the atmosphere every year. Nearly half that total could be eliminated by ending the wars against Iraq and Afghanistan. [TomDispatch.com June 16/07; huffingtonpost.com Oct 29/07]
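Readers can check this kind of fuel arithmetic for themselves with the EPA per-gallon factors just quoted. A minimal sketch----treating jet fuel as diesel-like is my assumption, and the inputs are the article's own figures, so the output is only as good as they are:

```python
# Convert gallons of fuel burned into short tons of CO2, using the EPA
# factors quoted above. Assumption: jet fuel is treated as diesel-like.
LBS_CO2_PER_GAL_GASOLINE = 19.4
LBS_CO2_PER_GAL_DIESEL = 22.2
LBS_PER_SHORT_TON = 2000

def co2_short_tons(gallons: float, lbs_per_gal: float) -> float:
    """CO2 emitted, in short tons, for a given volume of fuel burned."""
    return gallons * lbs_per_gal / LBS_PER_SHORT_TON

# The 2006 Air Force figure quoted above: 2.6 billion gallons of jet fuel.
print(f"{co2_short_tons(2.6e9, LBS_CO2_PER_GAL_DIESEL):,.0f} short tons")
# -> roughly 28.9 million short tons of CO2 from that jet fuel alone
```

Worked through this way, each quoted fuel figure converts to tens of millions of tons, which lets readers judge the article's larger totals for themselves.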
NANO RAIN
Meanwhile, the 60-year quest for weather warfare continues. Though a drone cannot carry a heavy payload, more sub-microscopic weather modification particles can be crammed into a UAV Predator than all the chemtrail slurry packed into a tanker the size of a DC-10.
According to the air force's own weather modification study, Owning The Weather 2025, clouds of these extremely teeny machines will be dropped into hurricanes and other weather systems to blend with storms and report real time weather data to each other and a larger sensor network.
Then these smart particles will be used to increase or decrease the storm's size and intensity - and "steer" it to "specific targets".
The air force report boasted that nano-chemtrails "will be able to adjust their size to optimal dimensions for a given seeding situation and make adjustments throughout the process." Instead of being sprayed into the air at the mercy of the winds aloft, as is the fate of normal chemtrails, nano versions will be able to "enhance their dispersal" by "adjusting their atmospheric buoyancy" and "communicating with each other" as they steer themselves in a single coordinated flock within their own artificial cloud.
Nano-chemtrails will even "change their temperature and polarity to improve their seeding effects," the air force noted. [Daily Texan July 30/07]
Rutgers University scientist J. Storrs Hall held out the military's hope that these new nano weather-warrior bots:
"Interconnected, atmospherically buoyant, and having navigation capability in three dimensions - clouds of microscopic computer particles communicating with each other and with a control system, could provide tremendous capability."
_________________________________________
GREEN TECHNOLOGY funding addressing the damage caused by International Economic Zones and global factories is tied to what we are going to use to replace----natural products. This article is tied to the CHEMTRAIL article in that all these processes are tied to nanotechnology, which, as this article rightly states, will not result in products WE THE PEOPLE will be able to afford or access. As with CHEMTRAIL technology, the energy and water spent in these processes----and the chemical pollution----negate much of the claim of ENVIRONMENTALISM OR BEING GREEN.
We are watching as the 1% Wall Street creates artificial food that will come to the 99%----recycled waste water coming to the 99%----and when that 5% rise in temperature and the over-harvesting of timber take all our natural wood----here's the coming fix, if you can afford it.
THIS IS CALLED DESIGNER MANUFACTURING ---IT WILL BUILD PRODUCTS THAT ONLY THE 1% AND THEIR 2% CAN AFFORD.
It is far-right 1% Wall Street thinking to spend all funding on what to do after global warming kills everything, rather than stopping and reversing the global market and global corporate campus economic structures to protect against these changes.
'The imperative thing is that these products — nanocomposites — are so expensive, and only privileged people are able to access and utilize them. So, from the above explanation, can we really say that nanocomposite-based wood (natural fibers) can be applicable and eco-friendly?
Is green nanocomposite-based wood an eco-friendly product?
Photo by Ollivier Girard for Center for International Forestry Research (CIFOR).
I believe, in this era, human beings can fully ascertain what nanotechnology is. Nanotechnology is basically an evolved and highly-developed innovative technology in the form of nano-sized products or machines (1-100 nanometers). There are two crucial traits of the technology, which are that products have higher (or the highest possible) strength properties compared to preliminary materials, and that the products are very small-scale, namely nano size.
Most expertise and research that concerns nanotechnology also campaigns for green composite-based nanotechnology. Researchers argue that because nanotechnology-based products are characterized by and isolated from natural fibers, they are environmentally friendly.
We can gauge the interest of human beings in benefiting from nanotechnology, but most people don’t realize that nanotechnology has existed since long ago. The Big Bang, for instance, was one of the natural implications of the technology. However, nanotechnology can also occur artificially through the associated high-level technology.
Apparently, nanotechnology has only recently been known about by societies of this decade, in which green nanotechnology is a booming part of human life. Green nanotechnology can be achieved by manufacturing green nanocomposite-derived lignocellulosic materials like wood fibers. Why do we use lignocelluloses from wood? Wood can produce renewable ultra-long fibers, which have super-strength properties both chemically and physically.
Due to the abundance of wood fibers in the world, synthetic fibers from non-renewable resources are substituted by natural fibers, especially those from wood. In terms of producing nanocomposite-based wood, we can say there are two sides of such green products, which are their biodegradable and renewable nature. Furthermore, such products also give high strength and stiffness combined with low weight. Due to their superior nature, most of these products are used for fabricating optical electronics, nanopapers, medical remedies, solar cells, and panel sensors.
In utilizing wood as the main filler and reinforcing material of nanoproducts, we know we should consider what kinds of wood are used — either from natural or plantation forests. In accordance with technology and innovation, wood waste can also be utilized, which can at least minimize deforestation by optimizing wood resources and reducing greenhouse gases (GHGs) by avoiding wood decomposition.
We always see the benefits of products without seeing the negative impacts of using them. In terms of nanocomposite-derived wood fibers, there is a non eco-friendly side to the products. What are these?
There are two opinions on the negative sides of nanocomposites.
First, the technology used to fabricate nanocomposites is clearly very high-tech, including ultrasonication, chemical pulping, homogenization (grinding), acid hydrolysis, steam explosion and so forth. In these processes, high levels of energy consumption are necessary. The most challenging part of utilizing energy, especially electricity, is related to climate change. Furthermore, there is a reversible use of energy that occurs when converting big-sized material to nano-size, and nano-sized to big products.
Second, one of the ways to extract nanocellulose fibers from wood requires chemical processing. One way is by using chemical substances, which are mostly not eco-friendly as they pollute the environment. For instance, bleaching and pulping chemical substances (such as NaOH, H2O2 or H2SO4) bring about water, land and air pollution by being H2T1 (harmful, hazardous and toxic). In addition, these residues can affect human health.
The imperative thing is that these products — nanocomposites — are so expensive, and only privileged people are able to access and utilize them. So, from the above explanation, can we really say that nanocomposite-based wood (natural fibers) can be applicable and eco-friendly? We should propel youth and academicians to develop effective technology to address the two key hurdles above. By giving innovative solutions to these hurdles, there will at least be benefits for our next generation and the environment.
Achmad Solikhin is part of the Indonesian Green Action Forum (IGAF).
_______________________________________________
As with biomedicine---it is not that we do not like the science----it is the fact that these technology pathways are super-sizing our way to a 5% rise in temperature from global warming. Whether massive solar panel plants, CHEMTRAIL biobots, or global health tourism-----all these NEW INNOVATIONS are tied to technology structures requiring tremendous amounts of energy, and they drive the crazy fracking, oil drilling, and battery technology that not only take energy but super-size the toxic waste and our fresh-water depletion. Super-mega computers tied to global corporate campuses and global factories----with massive energy platforms----all require COOLED ENVIRONMENTS----so temperature control will see that A/C you and I wanted for our homes directed to these global facilities instead. We haven't even built the infrastructure yet, but if we MOVE FORWARD with installing these global corporate campuses designated for DESIGNER PRODUCTS FOR A 1% AND THEIR 2%----we are hastening the worst of global warming.
'Even running electricity at full throttle has not been enough to satisfy the industry. In addition to generators, most large data centers contain banks of huge, spinning flywheels or thousands of lead-acid batteries — many of them similar to automobile batteries — to power the computers in case of a grid failure as brief as a few hundredths of a second, an interruption that could crash the servers'.
Power, Pollution and the Internet
By JAMES GLANZ | SEPT. 22, 2012
Data centers are filled with servers, which are like bulked-up desktop computers, minus screens and keyboards, that contain chips to process data. Credit Ethan Pines for The New York Times
SANTA CLARA, Calif. — Jeff Rothschild’s machines at Facebook had a problem he knew he had to solve immediately. They were about to melt.
The company had been packing a 40-by-60-foot rental space here with racks of computer servers that were needed to store and process information from members’ accounts. The electricity pouring into the computers was overheating Ethernet sockets and other crucial components.
Thinking fast, Mr. Rothschild, the company’s engineering chief, took some employees on an expedition to buy every fan they could find — “We cleaned out all of the Walgreens in the area,” he said — to blast cool air at the equipment and prevent the Web site from going down.
That was in early 2006, when Facebook had a quaint 10 million or so users and the one main server site. Today, the information generated by nearly one billion people requires outsize versions of these facilities, called data centers, with rows and rows of servers spread over hundreds of thousands of square feet, and all with industrial cooling systems.
They are a mere fraction of the tens of thousands of data centers that now exist to support the overall explosion of digital information. Stupendous amounts of data are set in motion each day as, with an innocuous click or tap, people download movies on iTunes, check credit card balances through Visa’s Web site, send Yahoo e-mail with files attached, buy products on Amazon, post on Twitter or read newspapers online.
A yearlong examination by The New York Times has revealed that this foundation of the information industry is sharply at odds with its image of sleek efficiency and environmental friendliness.
Most data centers, by design, consume vast amounts of energy in an incongruously wasteful manner, interviews and documents show. Online companies typically run their facilities at maximum capacity around the clock, whatever the demand. As a result, data centers can waste 90 percent or more of the electricity they pull off the grid, The Times found.
To guard against a power failure, they further rely on banks of generators that emit diesel exhaust. The pollution from data centers has increasingly been cited by the authorities for violating clean air regulations, documents show. In Silicon Valley, many data centers appear on the state government’s Toxic Air Contaminant Inventory, a roster of the area’s top stationary diesel polluters.
Worldwide, the digital warehouses use about 30 billion watts of electricity, roughly equivalent to the output of 30 nuclear power plants, according to estimates industry experts compiled for The Times. Data centers in the United States account for one-quarter to one-third of that load, the estimates show.
“It’s staggering for most people, even people in the industry, to understand the numbers, the sheer size of these systems,” said Peter Gross, who helped design hundreds of data centers. “A single data center can take more power than a medium-size town.”
Energy efficiency varies widely from company to company. But at the request of The Times, the consulting firm McKinsey & Company analyzed energy use by data centers and found that, on average, they were using only 6 percent to 12 percent of the electricity powering their servers to perform computations. The rest was essentially used to keep servers idling and ready in case of a surge in activity that could slow or crash their operations.
A server is a sort of bulked-up desktop computer, minus a screen and keyboard, that contains chips to process data. The study sampled about 20,000 servers in about 70 large data centers spanning the commercial gamut: drug companies, military contractors, banks, media companies and government agencies.
“This is an industry dirty secret, and no one wants to be the first to say mea culpa,” said a senior industry executive who asked not to be identified to protect his company’s reputation. “If we were a manufacturing industry, we’d be out of business straightaway.”
These physical realities of data are far from the mythology of the Internet: where lives are lived in the “virtual” world and all manner of memory is stored in “the cloud.”
The inefficient use of power is largely driven by a symbiotic relationship between users who demand an instantaneous response to the click of a mouse and companies that put their business at risk if they fail to meet that expectation.
Even running electricity at full throttle has not been enough to satisfy the industry. In addition to generators, most large data centers contain banks of huge, spinning flywheels or thousands of lead-acid batteries — many of them similar to automobile batteries — to power the computers in case of a grid failure as brief as a few hundredths of a second, an interruption that could crash the servers.
“It’s a waste,” said Dennis P. Symanski, a senior researcher at the Electric Power Research Institute, a nonprofit industry group. “It’s too many insurance policies.”
At least a dozen major data centers have been cited for violations of air quality regulations in Virginia and Illinois alone, according to state records. Amazon was cited with more than 24 violations over a three-year period in Northern Virginia, including running some of its generators without a basic environmental permit.
A few companies say they are using extensively re-engineered software and cooling systems to decrease wasted power. Among them are Facebook and Google, which also have redesigned their hardware. Still, according to recent disclosures, Google’s data centers consume nearly 300 million watts and Facebook’s about 60 million watts.
Many of these solutions are readily available, but in a risk-averse industry, most companies have been reluctant to make wholesale change, according to industry experts.
Improving or even assessing the field is complicated by the secretive nature of an industry that is largely built around accessing other people’s personal data.
For security reasons, companies typically do not even reveal the locations of their data centers, which are housed in anonymous buildings and vigilantly protected. Companies also guard their technology for competitive reasons, said Michael Manos, a longtime industry executive. “All of those things play into each other to foster this closed, members-only kind of group,” he said.
That secrecy often extends to energy use. To further complicate any assessment, no single government agency has the authority to track the industry. In fact, the federal government was unable to determine how much energy its own data centers consume, according to officials involved in a survey completed last year.
The survey did discover that the number of federal data centers grew from 432 in 1998 to 2,094 in 2010.
To investigate the industry, The Times obtained thousands of pages of local, state and federal records, some through freedom of information laws, that are kept on industrial facilities that use large amounts of energy. Copies of permits for generators and information about their emissions were obtained from environmental agencies, which helped pinpoint some data center locations and details of their operations.
In addition to reviewing records from electrical utilities, The Times also visited data centers across the country and conducted hundreds of interviews with current and former employees and contractors.
Some analysts warn that as the amount of data and energy use continue to rise, companies that do not alter their practices could eventually face a shake-up in an industry that has been prone to major upheavals, including the bursting of the first Internet bubble in the late 1990s.
“It’s just not sustainable,” said Mark Bramfitt, a former utility executive who now consults for the power and information technology industries. “They’re going to hit a brick wall.”
Bytes by the Billions
Wearing an FC Barcelona T-shirt and plaid Bermuda shorts, Andre Tran strode through a Yahoo data center in Santa Clara where he was the site operations manager. Mr. Tran’s domain — there were servers assigned to fantasy sports and photo sharing, among other things — was a fair sample of the countless computer rooms where the planet’s sloshing tides of data pass through or come to rest.
Aisle after aisle of servers, with amber, blue and green lights flashing silently, sat on a white floor punctured with small round holes that spit out cold air. Within each server were the spinning hard drives that store the data. The only hint that the center was run by Yahoo, whose name was nowhere in sight, could be found in a tangle of cables colored in the company’s signature purple and yellow.
“There could be thousands of people’s e-mails on these,” Mr. Tran said, pointing to one storage aisle. “People keep old e-mails and attachments forever, so you need a lot of space.”
This is the mundane face of digital information — player statistics flowing into servers that calculate fantasy points and league rankings, snapshots from nearly forgotten vacations kept forever in storage devices. It is only when the repetitions of those and similar transactions are added up that they start to become impressive.
Each year, chips in servers get faster, and storage media get denser and cheaper, but the furious rate of data production goes a notch higher.
Jeremy Burton, an expert in data storage, said that when he worked at a computer technology company 10 years ago, the most data-intensive customer he dealt with had about 50,000 gigabytes in its entire database. (Data storage is measured in bytes. The letter N, for example, takes 1 byte to store. A gigabyte is a billion bytes of information.)
Today, roughly a million gigabytes are processed and stored in a data center during the creation of a single 3-D animated movie, said Mr. Burton, now at EMC, a company focused on the management and storage of data.
Just one of the company’s clients, the New York Stock Exchange, produces up to 2,000 gigabytes of data per day that must be stored for years, he added.
EMC and the International Data Corporation together estimated that more than 1.8 trillion gigabytes of digital information were created globally last year.
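To put these storage figures on a single scale, here is a small sketch converting the numbers quoted above into common units; the conversions are standard decimal ones, and the inputs are the article's own figures:

```python
# Unit arithmetic for the storage figures quoted above (decimal convention).
GB = 1e9  # bytes in a gigabyte, as the article defines it

nyse_daily = 2_000 * GB       # NYSE: up to 2,000 gigabytes per day
one_movie = 1_000_000 * GB    # one 3-D animated movie: ~a million gigabytes
global_year = 1.8e12 * GB     # 1.8 trillion gigabytes created worldwide

print(nyse_daily * 365 / 1e15)  # ~0.73 petabytes of NYSE data per year
print(one_movie / 1e15)         # a single such movie is about 1 petabyte
print(global_year / 1e21)       # 1.8 zettabytes created globally in a year
```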
“It is absolutely a race between our ability to create data and our ability to store and manage data,” Mr. Burton said.
About three-quarters of that data, EMC estimated, was created by ordinary consumers.
With no sense that data is physical or that storing it uses up space and energy, those consumers have developed the habit of sending huge data files back and forth, like videos and mass e-mails with photo attachments. Even seemingly mundane actions like running an app to find an Italian restaurant in Manhattan or a taxi in Dallas require servers to be turned on and ready to process the information instantaneously.
The complexity of a basic transaction is a mystery to most users: Sending a message with photographs to a neighbor could involve a trip through hundreds or thousands of miles of Internet conduits and multiple data centers before the e-mail arrives across the street.
“If you tell somebody they can’t access YouTube or download from Netflix, they’ll tell you it’s a God-given right,” said Bruce Taylor, vice president of the Uptime Institute, a professional organization for companies that use data centers.
To support all that digital activity, there are now more than three million data centers of widely varying sizes worldwide, according to figures from the International Data Corporation.
Nationwide, data centers used about 76 billion kilowatt-hours in 2010, or roughly 2 percent of all electricity used in the country that year, based on an analysis by Jonathan G. Koomey, a research fellow at Stanford University who has been studying data center energy use for more than a decade. DatacenterDynamics, a London-based firm, derived similar figures.
The industry has long argued that computerizing business transactions and everyday tasks like banking and reading library books has the net effect of saving energy and resources. But the paper industry, which some predicted would be replaced by the computer age, consumed 67 billion kilowatt-hours from the grid in 2010, according to Census Bureau figures reviewed by the Electric Power Research Institute for The Times.
Direct comparisons between the industries are difficult: paper uses additional energy by burning pulp waste and transporting products. Data centers likewise involve tens of millions of laptops, personal computers and mobile devices.
Chris Crosby, chief executive of the Dallas-based Compass Datacenters, said there was no immediate end in sight to the proliferation of digital infrastructure.
“There are new technologies and improvements,” Mr. Crosby said, “but it still all runs on a power cord.”
‘Comatose’ Power Drains
Engineers at Viridity Software, a start-up that helped companies manage energy resources, were not surprised by what they discovered on the floor of a sprawling data center near Atlanta.
Viridity had been brought on board to conduct basic diagnostic testing. The engineers found that the facility, like dozens of others they had surveyed, was using the majority of its power on servers that were doing little except burning electricity, said Michael Rowan, who was Viridity’s chief technology officer.
A senior official at the data center already suspected that something was amiss. He had previously conducted his own informal survey, putting red stickers on servers he believed to be “comatose” — the term engineers use for servers that are plugged in and using energy even as their processors are doing little if any computational work.
“At the end of that process, what we found was our data center had a case of the measles,” said the official, Martin Stephens, during a Web seminar with Mr. Rowan. “There were so many red tags out there it was unbelievable.”
The Viridity tests backed up Mr. Stephens’s suspicions: in one sample of 333 servers monitored in 2010, more than half were found to be comatose. All told, nearly three-quarters of the servers in the sample were using less than 10 percent of their computational brainpower, on average, to process data.
The data center’s operator was not some seat-of-the-pants app developer or online gambling company, but LexisNexis, the database giant. And it was hardly unique.
In many facilities, servers are loaded with applications and left to run indefinitely, even after nearly all users have vanished or new versions of the same programs are running elsewhere.
“You do have to take into account that the explosion of data is what aids and abets this,” said Mr. Taylor of the Uptime Institute. “At a certain point, no one is responsible anymore, because no one, absolutely no one, wants to go in that room and unplug a server.”
Kenneth Brill, an engineer who in 1993 founded the Uptime Institute, said low utilization began with the field’s “original sin.”
In the early 1990s, Mr. Brill explained, software operating systems that would now be considered primitive crashed if they were asked to do too many things, or even if they were turned on and off. In response, computer technicians seldom ran more than one application on each server and kept the machines on around the clock, no matter how sporadically that application might be called upon.
So as government energy watchdogs urged consumers to turn off computers when they were not being used, the prime directive at data centers became running computers at all cost.
A crash or a slowdown could end a career, said Michael Tresh, formerly a senior official at Viridity. A field born of cleverness and audacity is now ruled by something else: fear of failure.
“Data center operators live in fear of losing their jobs on a daily basis,” Mr. Tresh said, “and that’s because the business won’t back them up if there’s a failure.”
In technical terms, the fraction of a computer’s brainpower being used on computations is called “utilization.”
McKinsey & Company, the consulting firm that analyzed utilization figures for The Times, has been monitoring the issue since at least 2008, when it published a report that received little notice outside the field. The figures have remained stubbornly low: the current findings of 6 percent to 12 percent are only slightly better than those in 2008. Because of confidentiality agreements, McKinsey is unable to name the companies that were sampled.
David Cappuccio, a managing vice president and chief of research at Gartner, a technology research firm, said his own recent survey of a large sample of data centers found that typical utilizations ran from 7 percent to 12 percent.
“That’s how we’ve overprovisioned and run data centers for years,” Mr. Cappuccio said. “ ‘Let’s overbuild just in case we need it’ — that level of comfort costs a lot of money. It costs a lot of energy.”
Servers are not the only components in data centers that consume energy. Industrial cooling systems, circuitry to keep backup batteries charged and simple dissipation in the extensive wiring all consume their share.
In a typical data center, those losses combined with low utilization can mean that the energy wasted is as much as 30 times the amount of electricity used to carry out the basic purpose of the data center.
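To see how the waste can reach "as much as 30 times," combine the low utilization reported above with the facility overhead for cooling, batteries and wiring. A minimal sketch under assumed numbers----the 1.9 overhead factor is an illustrative assumption, not a figure from The Times:

```python
# How low server utilization plus facility overhead compounds into waste.
# Assumption: overhead factor (total facility power / server power) of 1.9.
def waste_ratio(utilization: float, overhead: float) -> float:
    """Units of energy wasted per unit spent on actual computation."""
    useful_fraction = utilization / overhead  # share of grid power doing work
    return (1 - useful_fraction) / useful_fraction

# 6 percent utilization is the low end of the McKinsey range quoted above.
print(round(waste_ratio(utilization=0.06, overhead=1.9), 1))  # -> 30.7
```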
Some companies, academic organizations and research groups have shown that vastly more efficient practices are possible, although it is difficult to compare different types of tasks.
The National Energy Research Scientific Computing Center, which consists of clusters of servers and mainframe computers at the Lawrence Berkeley National Laboratory in California, ran at 96.4 percent utilization in July, said Jeff Broughton, the director of operations. The efficiency is achieved by queuing up large jobs and scheduling them so that the machines are running nearly full-out, 24 hours a day.
A company called Power Assure, based in Santa Clara, markets a technology that enables commercial data centers to safely power down servers when they are not needed — overnight, for example.
But even with aggressive programs to entice its major customers to save energy, Silicon Valley Power has not been able to persuade a single data center to use the technique in Santa Clara, said Mary Medeiros McEnroe, manager of energy efficiency programs at the utility.
“It’s a nervousness in the I.T. community that something isn’t going to be available when they need it,” Ms. McEnroe said.
ENERGY HUNGRY
Row after row after row of servers, at data centers around the world, perform the functions that constitute the cloud. They consume vast amounts of electricity, often wastefully. Credit Richard Perry/The New York Times
The streamlining of the data center done by Mr. Stephens for LexisNexis Risk Solutions is an illustration of the savings that are possible.
In the first stage of the project, he said that by consolidating the work in fewer servers and updating hardware, he was able to shrink a 25,000-square-foot facility into 10,000 square feet.
Of course, data centers must have some backup capacity available at all times and achieving 100 percent utilization is not possible. They must be prepared to handle surges in traffic.
Mr. Symanski, of the Electric Power Research Institute, said that such low efficiencies made sense only in the obscure logic of the digital infrastructure.
“You look at it and say, ‘How in the world can you run a business like that,’ ” Mr. Symanski said. The answer is often the same, he said: “They don’t get a bonus for saving on the electric bill. They get a bonus for having the data center available 99.999 percent of the time.”
The Best-Laid Plans
In Manassas, Va., the retailing colossus Amazon runs servers for its cloud amid a truck depot, a defunct grain elevator, a lumberyard and junk-strewn lots where machines compress loads of trash for recycling.
The servers are contained in two Amazon data centers run out of three buildings shaped like bulky warehouses with green, corrugated sides. Air ducts big enough to accommodate industrial cooling systems sprout along the rooftops; huge diesel generators sit in rows around the outside.
The term “cloud” is often generally used to describe a data center’s functions. More specifically, it refers to a service for leasing computing capacity. These facilities are primarily powered from the national grid, but generators and batteries are nearly always present to provide electricity if the grid goes dark.
The Manassas sites are among at least eight major data centers Amazon operates in Northern Virginia, according to records of Virginia’s Department of Environmental Quality.
The department is on familiar terms with Amazon. As a result of four inspections beginning in October 2010, the company was told it would be fined $554,476 by the agency for installing and repeatedly running diesel generators without obtaining standard environmental permits required to operate in Virginia.
Even if there are no blackouts, backup generators still emit exhaust because they must be regularly tested.
After months of negotiations, the penalty was reduced to $261,638. In a “degree of culpability” judgment, all 24 violations were given the ranking “high.”
Drew Herdener, an Amazon spokesman, agreed that the company “did not get the proper permits” before the generators were turned on. “All of these generators were all subsequently permitted and approved,” Mr. Herdener said.
The violations came in addition to a series of lesser infractions at one of Amazon’s data centers in Ashburn, Va., in 2009, for which the company paid $3,496, according to the department’s records.
Of all the things the Internet was expected to become, it is safe to say that a seed for the proliferation of backup diesel generators was not one of them.
Terry Darton, a former manager at Virginia’s environmental agency, said permits had been issued to enough generators for data centers in his 14-county corner of Virginia to nearly match the output of a nuclear power plant.
“It’s shocking how much potential power is available,” said Mr. Darton, who retired in August.
No national figures on environmental violations by data centers are available, but a check of several environmental districts suggests that the centers are beginning to catch the attention of regulators across the country.
Over the past five years in the Chicago area, for example, the Internet powerhouses Savvis and Equinix received violation notices, according to records from the Illinois Environmental Protection Agency. Aside from Amazon, Northern Virginia officials have also cited data centers run by Qwest, Savvis, VeriSign and NTT America.
Despite all the precautions — the enormous flow of electricity, the banks of batteries and the array of diesel generators — data centers still crash.
Amazon, in particular, has had a series of failures in Northern Virginia over the last several years. One, in May 2010 at a facility in Chantilly, took businesses dependent on Amazon’s cloud offline for what the company said was more than an hour — an eternity in the data business.
Pinpointing the cause became its own information glitch.
Amazon announced that the failure “was triggered when a vehicle crashed into a high-voltage utility pole on a road near one of our data centers.”
As it turns out, the car accident was mythical, a misunderstanding passed from a local utility lineman to a data center worker to Amazon headquarters. Instead, Amazon said that its backup gear mistakenly shut down part of the data center after what Dominion Virginia Power said was a short on an electrical pole that set off two momentary failures.
Mr. Herdener of Amazon said the backup system had been redesigned, and that “we don’t expect this condition to repeat.”
The Source of the Problem
Last year in the Northeast, a $1 billion feeder line for the national power grid went into operation, snaking roughly 215 miles from southwestern Pennsylvania, through the Allegheny Mountains in West Virginia and terminating in Loudoun County, Va.
The work was financed by millions of ordinary ratepayers. Steven R. Herling, a senior official at PJM Interconnection, a regional authority for the grid, said the need to feed the mushrooming data centers in Northern Virginia was the “tipping point” for the project in an otherwise down economy.
Data centers in the area now consume almost 500 million watts of electricity, said Jim Norvelle, a spokesman for Dominion Virginia Power, the major utility there. Dominion estimates that the load could rise to more than a billion watts over the next five years.
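Those two figures imply a steep growth curve: doubling from roughly 500 million watts to a billion watts in five years works out to about 15 percent a year. A minimal check (the compounding assumption is mine, not Dominion's):

    # Implied compound annual growth: 500 MW -> 1,000 MW over 5 years.
    start_mw, end_mw, years = 500, 1_000, 5
    cagr = (end_mw / start_mw) ** (1 / years) - 1
    print(f"Implied growth rate: {cagr:.1%} per year")  # about 14.9%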
Data centers are among utilities’ most prized customers. Many utilities around the country recruit the facilities for their almost unvarying round-the-clock loads. Large, steady consumption is profitable for utilities because it allows them to plan their own power purchases in advance and market their services at night, when demand by other customers plummets.
Mr. Bramfitt, the former utility executive, said he feared that this dynamic was encouraging a wasteful industry to cling to its pedal-to-the-metal habits. Even with all the energy and hardware pouring into the field, others believe it will be a challenge for current methods of storing and processing data to keep up with the digital tsunami.
Some industry experts believe a solution lies in the cloud: centralizing computing among large and well-operated data centers. Those data centers would rely heavily on a technology called virtualization, which in effect allows servers to merge their identities into large, flexible computing resources that can be doled out as needed to users, wherever they are.
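To make that idea concrete, here is a minimal sketch of the consolidation virtualization enables, using hypothetical workloads and host capacities (a toy first-fit packing, not any vendor's actual scheduler):

    # Toy model: pack many lightly loaded workloads onto a few hosts.
    HOST_CAPACITY = 100  # arbitrary units of CPU per physical server

    def consolidate(vm_loads):
        """Place each workload on the first host with room; open a new host if none fits."""
        hosts = []  # remaining capacity of each physical host
        for load in sorted(vm_loads, reverse=True):
            for i, free in enumerate(hosts):
                if load <= free:
                    hosts[i] -= load
                    break
            else:
                hosts.append(HOST_CAPACITY - load)
        return hosts

    # Twenty dedicated servers, each running one small workload (6-12% utilization)...
    vm_loads = [6, 8, 12, 7, 9, 10, 6, 11, 8, 7, 9, 12, 6, 10, 8, 7, 11, 9, 6, 10]
    hosts = consolidate(vm_loads)
    print(f"{len(vm_loads)} dedicated servers -> {len(hosts)} virtualized hosts")
    print(f"average utilization: {sum(vm_loads) / (len(hosts) * HOST_CAPACITY):.0%}")
    # ...become 2 hosts running at 86% utilization in this toy example.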
One advocate of that approach is Mr. Koomey, the Stanford data center expert. But he said that many companies that try to manage their own data centers, either in-house or in rental spaces, are still unfamiliar with or distrustful of the new cloud technology. Unfortunately, those companies account for the great majority of energy usage by data centers, Mr. Koomey said.
Others express deep skepticism of the cloud, saying that the sometimes mystical-sounding belief in its possibilities is belied by the physicality of the infrastructure needed to support it.
Using the cloud “just changes where the applications are running,” said Hank Seader, managing principal for research and education at the Uptime Institute. “It all goes to a data center somewhere.”
Some wonder if the very language of the Internet is a barrier to understanding how physical it is, and is likely to stay. Take, for example, the issue of storing data, said Randall H. Victora, a professor of electrical engineering at the University of Minnesota who does research on magnetic storage devices.
“When somebody says, ‘I’m going to store something in the cloud, we don’t need disk drives anymore’ — the cloud is disk drives,” Mr. Victora said. “We get them one way or another. We just don’t know it.”
Whatever happens within the companies, it is clear that among consumers, what are now settled expectations largely drive the need for such a formidable infrastructure.
“That’s what’s driving that massive growth — the end-user expectation of anything, anytime, anywhere,” said David Cappuccio, a managing vice president and chief of research at Gartner, the technology research firm. “We’re what’s causing the problem.”
_________________________________________
When our labor union and justice organization leaders come out locally shouting JOBS, JOBS, JOBS, HOUSING FOR THE POOR, HOUSING FOR THE POOR----and never address these elephant-in-the-room issues that harm everyone, but especially the poor----demanding the jobs tied to building these structures is like having the convicted man build his own gallows.
THERE IS NO JUSTICE---THERE IS NO ENVIRONMENTALISM---THERE WILL BE ALMOST NO PERMANENT EMPLOYMENT, AS THESE FACILITIES ARE BUILT TO BE ROBOTIC AND COMPUTER-DRIVEN.
On top of all that-----these entire CORPORATE SUSTAINABILITY policies are geared towards only the future of the 1% and their 2%.
Energy
The Surprisingly Large Energy Footprint of the Digital Economy [UPDATE]
Our computers and smartphones might seem clean, but the digital economy uses a tenth of the world's electricity — and that share will only increase, with serious consequences for the economy and the environment
By Bryan Walsh @bryanrwalsh | Aug. 14, 2013
Which uses more electricity: the iPhone in your pocket, or the refrigerator humming in your kitchen? Hard as it might be to believe, the answer is probably the iPhone. As you can read in a post on a new report by Mark Mills — the CEO of the Digital Power Group, a tech- and investment-advisory firm — a medium-size refrigerator that qualifies for the Environmental Protection Agency’s Energy Star rating will use about 322 kWh a year. The average iPhone, according to Mills’ calculations, uses about 361 kWh a year once the wireless connections, data usage and battery charging are tallied up. And the iPhone — even the latest iteration — doesn’t even keep your beer cold. (Hat tip to the Breakthrough Institute for noting the report first.)
The iPhone is just one reason why the information-communications-technologies (ICT) ecosystem, otherwise known as the digital economy, demands such a large and growing amount of energy. The global ICT system includes everything from smartphones to laptops to digital TVs to — especially — the vast and electron-thirsty computer-server farms that make up the backbone of what we call “the cloud.” In his report, Mills estimates that the ICT system now uses 1,500 terawatt-hours of electricity per year. That’s about 10% of the world’s total electricity generation, or roughly the combined power production of Germany and Japan. It’s the same amount of electricity that was used to light the entire planet in 1985. We already use 50% more energy to move bytes than we do to move planes in global aviation. No wonder your smartphone’s battery juice constantly seems on the verge of running out.
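The Germany-plus-Japan comparison checks out roughly, using round public generation figures (my approximations for the early 2010s, not numbers from Mills' report):

    # Rough sanity check of the "Germany plus Japan" comparison.
    germany_twh = 600    # approximate annual generation (assumption)
    japan_twh = 1_000    # approximate annual generation (assumption)
    ict_twh = 1_500      # Mills' estimate for the global ICT system
    print(f"Germany + Japan ~ {germany_twh + japan_twh:,} TWh vs. ICT ~ {ict_twh:,} TWh")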
As our lives migrate to the digital cloud — and as more and more wireless devices of all sorts become part of our lives — the electrons will follow. And that shift underscores how challenging it will be to reduce electricity use and carbon emissions even as we become more efficient.
Here’s an example: the New Republic recently ran a story arguing that the greenest building in New York City — the Bank of America Tower, which earned the Leadership in Energy and Environmental Design’s (LEED) highest Platinum rating — was actually one of the city’s biggest energy hogs. Author Sam Roudman argued that all the skyscraper’s environmentally friendly add-ons — the waterless urinals, the daylight dimming controls, the rainwater harvesting — were outweighed by the fact that the building used “more energy per square foot than any comparably sized office building in Manhattan,” consuming more than twice as much energy per square foot as the 80-year-old (though recently renovated) Empire State Building.
Why did an ultra-green tower need so much electricity? The major culprit was the building’s trading floors, full of fields of energy-thirsty workstations with five computers to a desk:
Assuming no one turns these computers off, in a year one of these desks uses roughly the energy it takes a 25-mile-per-gallon car engine to travel more than 4,500 miles. The servers supporting all those desks also require enormous energy, as do the systems that heat, cool and light the massive trading floors beyond normal business hours. These spaces take up nearly a third of the Bank of America Tower’s 2.2 million total square feet, yet the building’s developer and architect had no control over how much energy would be required to keep them operational.
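That comparison is easy to unpack: 4,500 miles at 25 miles per gallon is 180 gallons of gasoline, and at roughly 33.7 kWh per gallon (the EPA's gasoline-equivalence figure; my addition, not a number from the TNR piece) one desk works out to about 6,000 kWh a year:

    # Back-of-the-envelope energy for one five-computer trading desk.
    miles, mpg = 4_500, 25
    kwh_per_gallon = 33.7    # EPA gasoline-equivalence figure (assumption)
    gallons = miles / mpg    # 180 gallons
    print(f"{gallons:.0f} gallons ~ {gallons * kwh_per_gallon:,.0f} kWh per year")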
I think — and others agree — that the TNR article was unfair. There’s lots of silliness in the LEED ratings system — see this Treehugger post for evidence — but it’s not the Bank of America building itself that’s responsible for that massive carbon footprint. It’s what’s being done inside the building, as those hardworking computers suck electricity 24 hours a day, seven days a week. The fact that a skyscraper with so many cutting-edge, energy-efficient features can still use so much energy because it needs to play a full-time role in the cloud underscores just how electricity-intensive the digital economy can be.
That’s because the cloud uses energy differently than other sectors of the economy. Lighting, heating, cooling, transportation — these are all power uses that have rough limits. As your air conditioner or lightbulb becomes more efficient, you might decide to then use them more often — in energy efficiency, that is what’s known as the rebound effect. But you can only heat your home so much, or drive so far before you reach a period of clearly diminishing returns. Just because my Chevy Volt can get 100 miles per gallon doesn’t mean I’m going to drive back and forth to Washington each day. So it stands to reason that as these appliances become more efficient, we can potentially limit and even reduce energy consumption without losing value — which is indeed what’s happened in recent years in the U.S. and other developed nations.
But the ICT system derives its value from the fact that it’s on all the time. From computer trading floors or massive data centers to your own iPhone, there is no break time, no off period. (I can’t be the only person who keeps his iPhone on at night for emergency calls because I no longer have a home phone.) That means a constant demand for reliable electricity. According to Mills, efficiency improvements in the global ICT system began to slow around 2005, even as global data traffic began to spike thanks to the emergence of wireless broadband for smartphones and tablets. As anyone who has ever tried to husband the battery of a dying smartphone knows, transmitting wireless data — whether via 3G or wi-fi — adds significantly to power use. As the cloud grows bigger and bigger, and we put more and more of our devices on wireless networks, we’ll need more and more electricity. How much? Mills calculates that it takes more electricity to stream a high-definition movie over a wireless network than it would have taken to manufacture and ship a DVD of that same movie.
Look at our smartphones: as they become more powerful, they also use more power. Slate’s Farhad Manjoo called this the “smartphone conundrum” in a piece earlier this year:
Over the next few years, at least until someone develops better battery technology, we’re going to have to choose between smartphone performance and battery life. Don’t worry — phones will keep getting faster. Chip designers will still manage to increase the speed of their chips while conserving a device’s power. The annual doubling in phone performance we’ve seen recently isn’t sustainable, though. Our phones are either going to drain their batteries at ever increasing rates while continuing to get faster — or they’re going to maintain their current, not-great-but-acceptable battery life while sacrificing huge increases in speed. It won’t be possible to do both.
And that’s just our phones. What’s unique about the ICT system is that companies keep introducing entirely new product lines. In 1995, you might have had a desktop computer and perhaps a game system. In 2000, maybe you had a laptop and a basic cell phone. By 2009, you had a laptop and a wireless-connected smartphone. Today you may well have a laptop, a smartphone, a tablet and a streaming device for your digital TV. The even more connected might be wearing a Fitbit tracker, writing notes with a wi-fi-enabled Livescribe pen and tracking their runs with a GPS watch. And there will certainly be more to come, as the best minds of our generation design new devices for us to buy. In a piece yesterday, Manjoo reviewed the Pebble, the first — but almost certainly not the last — major “smartwatch.” At a moment when young people are buying fewer cars and living in smaller spaces — reducing energy needs for transportation and heating/cooling — they’re buying more and more connected devices. Of course the electricity bill is going to go up.
None of this is to argue that energy efficiency isn’t important in the ICT sector. Just as the Bank of America Tower’s green features keep its gigantic electricity demand from ballooning even more, efficient smartphones and laptops can slow the growth of the cloud’s carbon footprint. But grow it will. Energy efficiency has never been a big part of the sales strategy for digital devices, probably because electricity is still cheap in the U.S. and it’s something we pay for in bulk at the end of the month. Compare the feeling of paying your utility bill to the irritation of forking out $3.50 a gallon to fill up your car. The costs of electricity are hidden in our society.
That includes the environmental costs. The full title of Mills’ report is The Cloud Begins With Coal: Big Data, Big Networks, Big Infrastructure and Big Power, and it’s sponsored by the National Mining Association and the American Coalition for Clean Coal Electricity. Unsurprisingly, the report argues that coal — still the single biggest source of electricity in the U.S. — essentially powers our wonderful cloud. (And it is wonderful! The cloud generates a lot of value for all the electricity it uses.) Coal is hardly the only source of electricity that can keep the ICT system going — cleaner natural gas is already gaining, nuclear provides carbon-free base-load power, and renewables are growing fast. Certain aspects of the ICT system will also help reduce energy use, as smart grids and smart meters promote conservation. But users of the wireless cloud are likely to grow from 42.8 million people in 2008 to nearly 1 billion in 2014 — and that’s just the beginning, as smartphones spread from the developed to the developing world. We already have a gigantic digital cloud, and it’s only going to get bigger. What we need is a cleaner one.
[Update: Along those lines, digital companies have been taking steps to clean the cloud by procuring more of their energy from low-carbon sources. Apple’s data centers, for instance, are 100% powered by renewable energy, and the company is working to increase its renewable energy use overall. Google gets 34% of the energy for its operations from renewable sources.
Smart companies are looking to site power-hungry data centers near reliable sources of renewable energy: large hydro plants, like the ones near the new data center Facebook recently opened in Sweden, or utility-scale wind farms. Ultimately, though, it’s less the responsibility of the companies themselves than of the economy as a whole to make the shift to cleaner energy. As more and more people buy more and more cloud-connected devices—and as electric cars and other forms of electrified transport replace petroleum-powered vehicles—the demand for electricity will grow. It’s up to us to push to make it cleaner.]
*A note on the calculations of smartphone energy use. This comes from an email by Max Luke, a policy associate at the Breakthrough Institute, which posted about Mills’ study:
Last year the average iPhone customer used 1.58 GB of data a month, which times 12 is 19 GB per year. The most recent data put out by A.T. Kearney for the mobile industry association GSMA (p. 69) says that each GB requires 19 kWh. That means the average iPhone uses (19 kWh × 19 GB) 361 kWh of electricity per year. In addition, A.T. Kearney calculates each connection at 23.4 kWh. That brings the total to 384.4 kWh. The electricity used annually to charge the iPhone is 3.5 kWh, raising the total to 388 kWh per year. EPA’s Energy Star shows refrigerators with efficiency as low as 322 kWh annually.
Breakthrough ran the numbers on the iPhone specifically—Mills’ endnotes (see page 44 in the report) refer to smartphones and tablets more generally—but Luke notes that Mills confirmed the calculations.
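The arithmetic in Luke's note is simple to reproduce (all inputs are the figures quoted above; the quote rounds 18.96 GB up to 19 GB, which is why it reports 361 and 388 kWh):

    # Reproducing the Breakthrough Institute iPhone estimate quoted above.
    gb_per_year = 1.58 * 12    # monthly data use times twelve (~19 GB)
    kwh_per_gb = 19.0          # network energy per GB, as quoted
    connection_kwh = 23.4      # per-connection overhead, as quoted
    charging_kwh = 3.5         # annual battery charging, as quoted
    total = gb_per_year * kwh_per_gb + connection_kwh + charging_kwh
    print(f"~{total:.0f} kWh per year")   # ~387 kWh, vs. 322 kWh for an
                                          # efficient Energy Star refrigerator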
As I noted in the update at the top of the post, these estimates are at the very high end—other researchers have argued that power use by smartphones is much lower. And the Mills study itself has come in for strong criticism from other experts, as this MSN post notes:
Gernot Heiser, a professor at the University of New South Wales in Sydney and co-author of a 2010 study on power consumption in smartphones, echoed Koomey’s sentiments that Mills’ work was flawed.
Writing to MSN News, Heiser said Mills’ work “seems blatantly wrong.” He said Mills overestimates the amount of power used by a modern smartphone, in this case a Galaxy S III, by more than four times.
“I’d have to have a quick look to see how they arrive at this figure, but it certainly looks like baloney to me,” Heiser said.
Gang Zhou, an associate professor of computer science at the College of William and Mary, was less direct in attacking Mills’ claims, but said Mills’ measurements of smartphone power consumption were at least “one or two magnitude[s]” higher than they should be. Nonetheless, Zhou said the subject of data center electricity usage is an important issue that “should raise concern.”
Still, I think the takeaway from this isn’t about the energy use of individual brands or even whole classes of devices. The point is that as our always-on digital economy grows more extensive—and it will—we need to be more aware of the energy demands that will follow. The study from CEET in Melbourne that I noted in the update at the top of the post assumes much lower power consumption by individual devices than Mills’ work, but it still raises the alarm about the growing energy demand from cloud services.
As I write above, the nature of a smartphone or a tablet makes it hard to realize how much energy it may be using—especially given the fact that the electricity is often produced at plants far away from our outlets. At a gas station, for instance, the immediate cost and the smell of petrol is a potent reminder that we’re consuming energy. The digital economy is built on the sensation of seamlessness—but it still comes with a utility bill.