As we continue to progress and understand our world to create a stronger future for generations to come, we are beginning to understand the environmental impact our energy needs have. It has become important to reduce energy consumption in all areas, from the individual consumer to all forms of industry.
In the IT field this reduction effort is referred to as green computing, a general term covering the responsible and ethical use of an organization's IT resources. As IT resources have become an essential part of most organizations, it is increasingly important to use them as efficiently as possible.
Green computing is not only about reducing greenhouse emissions; it also translates into significant, measurable monetary savings. An aggressive environmental use policy will significantly reduce an organization's overall operating costs. For this reason alone, such a policy needs to be approached in a serious manner.
Measuring a Company’s Footprint
The first step towards knowing your own environmental footprint is to measure everything you possibly can. CPU load, user activity, outdoor temperature, and lighting use are just a few examples of what can be measured. The data you collect is useful in understanding how your energy is used and where reductions can be made.
Tools already exist to monitor everything from the physical system (for example, temperature and power draw) to how resource-heavy individual applications are (CPU, RAM, and HDD usage) [CSS08]. This data will provide you with the information you need to understand your organization's current costs and environmental impact. It will also help identify where major inefficiencies exist, and where you will need to shift your focus in order to reduce energy usage while improving efficiency.
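As a minimal sketch of this kind of measurement, assuming a Unix-like host and using only Python's standard library (dedicated monitoring tools report far more, including temperature and power draw):

```python
import os
import shutil

def resource_snapshot(path="/"):
    """Collect a coarse snapshot of host resource usage.

    A sketch only: real monitoring suites sample continuously and
    cover power and temperature as well.
    """
    load1, load5, load15 = os.getloadavg()   # CPU load averages (Unix)
    disk = shutil.disk_usage(path)           # HDD capacity and usage
    return {
        "cpu_count": os.cpu_count(),
        "load_1min": load1,
        "disk_used_pct": round(100 * disk.used / disk.total, 1),
    }

print(resource_snapshot())
```

Snapshots like this, logged over time, form the raw data from which a cost profile can be built.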
Using this data it is possible to create a cost profile [CSS08]. A cost profile helps break down costs into categories. Knowing how much energy a component uses on a yearly basis, and how much that energy costs, we can build an associated cost profile for each individual component. We can also break this down further into per-application energy cost.
If we know, for example, that it costs 100 euros to run a workstation's CPU for a year, then an application that consumes 20 percent of that CPU load is costing the organization 20 euros a year. At a larger scale, such as a server room or server farm, it is easy to see that efficient programming translates to lower operational costs [CSS08]. Whereas a workstation has a single CPU, a server farm may contain thousands. Say we improve our software application's code efficiency by 25 percent: the 5 euros saved per year on an individual workstation could easily become a 5000 euro reduction across a server farm.
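The worked example above can be expressed as a short calculation (the euro figures are the illustrative ones from the text, not measurements):

```python
def application_energy_cost(cpu_cost_per_year, cpu_share):
    """Yearly energy cost attributable to an application that
    consumes a given fraction of a workstation's CPU."""
    return cpu_cost_per_year * cpu_share

cpu_cost = 100.0                                       # euros/year per CPU
app_cost = application_energy_cost(cpu_cost, 0.20)     # 20 euros/year

# A 25% code-efficiency improvement in that application:
savings_per_workstation = app_cost * 0.25              # 5 euros/year
savings_server_farm = savings_per_workstation * 1000   # across 1000 CPUs

print(savings_per_workstation, savings_server_farm)
```

The same per-component profile can be repeated for RAM, disks, and so on to build the full cost breakdown.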
Reducing excess usage
Cost reduction is one of the most attractive aspects of green computing. Energy costs can be divided into four major categories [Lew08]: cooling, computation, power conversion, and hoteling. These categories are never clear-cut, tending to be deeply intertwined.
A reduction in one category often leads directly to a reduction in another. For example, introducing efficient power conversion reduces energy use on its own merit, while additionally reducing heat, which in turn lowers cooling costs.
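This coupling can be illustrated with a back-of-the-envelope model. All numbers here are assumptions for illustration (a 70% versus 90% efficient power supply, and an assumed half watt of cooling power per watt of waste heat), not measured values:

```python
def facility_draw(it_load_w, psu_efficiency, cooling_per_watt=0.5):
    """Total wall-socket power needed to deliver a given IT load.

    Assumptions: the PSU dissipates its conversion losses as heat,
    and removing each watt of heat costs `cooling_per_watt` extra
    watts of cooling energy.
    """
    draw = it_load_w / psu_efficiency       # power drawn including losses
    waste_heat = draw - it_load_w           # conversion losses become heat
    return draw + waste_heat * cooling_per_watt

old = facility_draw(10_000, psu_efficiency=0.70)
new = facility_draw(10_000, psu_efficiency=0.90)
print(round(old - new), "watts saved by better conversion plus less cooling")
```

Note that the saving is larger than the conversion loss alone, precisely because less waste heat also means less cooling.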
Computer resources generate a considerable amount of heat, which needs to be removed as quickly as possible in order to prevent heat buildup. Workstation and server components tend to have higher failure rates as temperatures rise.
Cooling systems for server rooms are heavy energy consumers, and need to be studied carefully in order to integrate an energy-efficient design into the system architecture [Lew08]. Decisions about a server room's infrastructure need to be made early on in order to allow for cheaper and more efficient cooling.
Areas with colder climates have been of interest for this very reason. Traditionally, heat is transferred directly into the surrounding environment. Savings can come from the idea that the heat can be put to better use. We can pump the heat throughout the building to reduce heating costs. We can also create a new use for the excess heat, such as warming a greenhouse.
Computer systems use a significantly greater amount of resources while they are actively running applications than when they sit idle. Inefficiencies in an application's code lead to higher resource usage within a given system, directly correlating with increased energy use and heat. Computer systems have a variety of power-draining components, including the CPU, RAM, HDD, fans and so on. As it stands, the highest energy consumer in a computer system is the CPU. Cleaning out these inefficiencies can save an organization a significant amount in operating costs [CSS08].
Although too much idle time is not ideal, reducing the number of operating servers will quickly cut the energy requirements of any facility [CSS08]. Many operating systems run extra services and applications by default, in case they might be needed, even though they are not useful for all users. It is wise to take a careful look at what a particular user or organization requires, and then eliminate the services and applications that are not needed.
For example, a typical user may need a GUI to work, but why waste critical resources running a GUI on a server that may never be physically accessed? Tweaking systems to run with only what is necessary will help reduce the resource load.
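The audit step described above amounts to comparing what is running against what the role actually requires. A sketch, with entirely hypothetical service names standing in for a real server's whitelist:

```python
# Hypothetical whitelist for one server role; a real list would come
# from the organization's requirements review.
REQUIRED = {"sshd", "cron", "nginx"}

def services_to_disable(running, required=REQUIRED):
    """Return running services that the role does not require,
    i.e. candidates for disabling to cut resource load."""
    return sorted(set(running) - set(required))

running_now = ["sshd", "cups", "bluetooth", "nginx", "cron", "gdm"]
print(services_to_disable(running_now))   # e.g. the GUI login manager,
                                          # printing, bluetooth
```

On a real system the `running` list would be gathered from the operating system's service manager rather than hard-coded.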
Power conversion (changing from AC to DC power) can be incredibly inefficient, and should be looked at as one area of major efficiency gains in the system as a whole [Lew08]. Simply changing the power supplies in servers and workstations to more efficient models will result in immediate energy savings, as well as heat reduction.
Hoteling refers to the general resources in place to keep a facility accessible to people, such as server rooms or computer labs: in short, what a physical, living human being needs while present. The most significant energy consumer here is lighting. Trimming hoteling loads removes a small but unnecessary portion of energy use and heat creation.
Moving towards more efficient lighting sources (such as LED or CFL lighting, as opposed to incandescent), turning off non-critical workstations and servers, along with generally shutting off all resources associated with hoteling when people are not in the facility [Lew08] can lead to significant energy reduction.
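The lighting saving alone is easy to estimate. The wattages below are typical catalogue figures (an assumed 60 W incandescent bulb versus a roughly comparable 9 W LED), used purely for illustration:

```python
def lighting_kwh(n_fixtures, watts_each, hours_per_day):
    """Yearly lighting energy in kWh for a set of fixtures."""
    return n_fixtures * watts_each * hours_per_day * 365 / 1000

# A facility with 200 fixtures lit 12 hours a day:
incandescent = lighting_kwh(200, 60, 12)
led = lighting_kwh(200, 9, 12)   # assumed comparable light output

print(round(incandescent - led), "kWh saved per year by switching to LED")
```

Shutting lighting off entirely outside working hours reduces the `hours_per_day` term on top of the per-fixture saving.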
Before drastic energy cost-cutting actions are taken, you need to look into what your organization requires. Redundancy of network equipment is crucial, allowing your organization to continue operating through network outages and problems.
On the other hand, redundancy creates complexities in a system design, which, if not done carefully, increases energy needs significantly. Additional servers, redundant power supplies, and disk arrays all add to the total energy use of an organization.
A decision will have to be made on how to balance the organization’s needs in order to have continued stability while reducing energy costs and its environmental impact.
If we break down the components of any computer system, we will find that different components require different amounts of energy to run. When thinking about how to better balance component use, virtualization plays an important role.
Virtualization allows one to use system hardware in non-traditional ways, which in turn makes further energy savings possible [FrP08]. For this reason virtualization is an interesting topic; it has become an increasingly popular way for organizations to manage their own energy needs.
For example, in a virtual environment we can create a RAM disk, allowing the system to run entirely within memory. This offloads much of the work that would traditionally take place on a hard disk and moves local data storage over to the far more energy efficient RAM [FrP08].
Although this sort of virtualization can also be done on user workstations, limited RAM availability coupled with the need to prevent data loss makes it less than ideal there. However, in server environments, where energy savings can be more dramatic and resources more plentiful, virtualization is becoming an energy-saving standard.
Eliminate idle time
Significant energy savings are not only to be found on the server side, but also on the client side. If client workstations are consistently running twenty-four hours a day, seven days a week, their energy consumption may prove to be many times more costly than the purchase of the workstations themselves. Generally, client workstations do not need to be running all the time.
Current Advanced Configuration and Power Interface (ACPI) standards allow for a multitude of energy-saving options. ACPI provides an open standard for power management, allowing operating systems to place components in different power states. For example, screen-savers can be disabled in favor of a power plan that automatically shuts off monitors and puts workstations to sleep when not in use.
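How much such a power plan saves is straightforward to estimate. The wattages below are assumptions for illustration (an assumed 120 W active draw and a few watts in sleep); real figures should come from measurement:

```python
def annual_kwh(active_w, sleep_w, active_hours_per_day):
    """Yearly energy in kWh for one workstation under a sleep policy."""
    sleep_hours = 24 - active_hours_per_day
    daily_wh = active_w * active_hours_per_day + sleep_w * sleep_hours
    return daily_wh * 365 / 1000

always_on = annual_kwh(active_w=120, sleep_w=120, active_hours_per_day=24)
with_sleep = annual_kwh(active_w=120, sleep_w=3, active_hours_per_day=9)

print(round(always_on - with_sleep), "kWh saved per workstation per year")
```

Multiplied over thousands of workstations, this single policy change dominates many other client-side measures.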
Within an organization, individual workstation components should also be taken into account. This can be done by selecting power-efficient processors and graphics processors, as well as by disabling unused components (sound card, modem, wireless networking, etc.).
A user's computing needs should be balanced with energy efficiency. Output devices such as monitors are one area for energy reduction. Organic light-emitting diode (OLED) monitors promise a solution that uses environmentally friendly materials while greatly reducing energy consumption compared to older technologies.
Although a change to a single user's workstation may seem insignificant (and most likely is), these changes repeated over what may be thousands of workstations add up to significant savings in operating costs. Workstation and server idle time is a waste of an organization's resources; an organization should aim to reduce idle time to a minimum, while avoiding interference with users' productivity.
An organization’s environmental impact is not only defined by energy consumption. How an organization manages its electronic waste needs to be taken into account in any environmental use policy.
Electronic components carry large amounts of lead that, when dumped, is toxic to the environment. Choosing to purchase components based on silver or copper can reduce the toxicity of a given component, but the increased cost along with reduced reliability makes this an unattractive option [CBN09].
Plastics found in nearly every device add to the negative impact of disposal [CBN09]. A reduction in the plastics used, or better still, replacement of plastics with more environmentally friendly options such as biodegradable plastics, should be sought [CBN09]. It is important to reduce the physical systems the organization requires to operate, and to reuse systems wherever possible in order to reduce waste.
The manufacturing of computer components has a very heavy environmental impact within a product's lifecycle. From a manufacturing point of view, increasing recycling and reducing the use of toxic materials are necessary for an environmentally friendly approach.
Dell has taken an aggressive approach to manufacturing within its own facilities, meeting 21 percent of its energy needs from green energy sources and recycling 95 percent of all waste within its facilities [PiC11]. Toshiba has taken a strong approach to creating more environmentally friendly products.
After manufacturing, a product's actual energy consumption can leave a bigger footprint than the manufacturing itself. Toshiba has actively placed power-efficient components into its products [PiC11]. These are not changes a customer would notice unless told, but they show a commitment from computer manufacturers to reduce their own environmental impact.
Speed No Longer a Concern
It is important to note that as computer systems continue to improve rapidly in performance, users will concern themselves less with the speed of their devices and begin to shift their attention to other aspects.
With an increasingly environmentally aware society, consumer demand is shifting towards more socially and environmentally friendly options [PiC11]. This shift encourages energy-saving and cost-cutting techniques across every aspect of the IT world, from manufacturing to service-based facilities. The increased demand has also created a market that is quickly being filled by environmentally aware competitors.
Companies such as VIA have introduced very low powered CPUs, Apple actively promotes its reduction of toxic materials in its products [App11], and Dell has moved towards bamboo packaging, even as far as using bamboo in some of its products [Del11].
Start a Green IT Policy!
Green computing is a fairly new topic, with much research yet to come. As mentioned several times, green computing is a necessity in an increasingly environmentally aware society.
For an organization, green IT leads to significant cost savings. In order for green computing to effectively reduce an organization's environmental footprint, it has to be integrated into the organization's architecture. Every detail, from how energy is used on the server side to the client side, needs to be carefully inspected. It must be revisited and accounted for whenever a change is made in order to remain effective.
In today’s world environmental impact has to be closely monitored. Not only is this important to the livelihood of an organization in terms of financial cost, it also carries a measurable social cost.
Green computing is increasingly tied to an organization’s needs; as the IT industry continues to push forward, the environmental aspect will play an ever-increasing role.
App11 Apple Inc., Apple Environmental Reports. (2011) http://www.apple.com/environment/reports/
CSS08 Chheda, R., Shookowsky, D., Stefanovich, S., Toscano, J., Profiling Energy Usage for Efficient Consumption. The Architecture Journal, 18 (2008), pp. 24-27.
CBN09 Chakraborty, P., Bhattacharyya, D., Nargiza, S., Bedajna, S., International Journal of Grid and Distributed Computing, 2, 3 (2009), pp. 33-38.
Del11 Dell Inc., Greener Products and Packaging article. (2011) http://content.dell.com/us/en/corp/d/corp-comm/bamboo-packaging
FrP08 Francis, F., Richardson, P., Green Maturity Model for Virtualization. The Architecture Journal, 18 (2008), pp. 9-15.
Lew08 Lewis, C., Environmentally Sustainable Infrastructure Design. The Architecture Journal, 18 (2008), pp. 2-8.
PiC11 Pichetpongsa, N., Campeanu, G., Analysis of Green Information Technology in Dell and Toshiba Companies. Research article (2011).