Tag Archives: energy

Wearable Technology…. YES!!!

30 May


From MEMS technology to the nuances of "Things to Wear," we are finding more ways to stay connected.  The notion of the "Big Brother" funnel is now beginning to be inverted into "Big Individual": small personal devices link to clouds and virtual life tools, specifically tailored to the individual.  Big Brother has no clothes and no teeth.  But who is really paying close attention to this transformation?  Well, I am.  I hope you are as well.  Let me know what you think about the death of the Big Brother syndrome.

This is the start of the Internet of Things (IoT), leading to the Internet of Everything (IoE): energy independence and fair costs based upon the new energy economy.  SD1 Incorporated and its new LEXI™ Bundled Services platform is a forerunner of things to come.

Stay Tuned.

Internet of Things (IoT): Intel and Intel stakeholders and developers. Internet of Everything (IoE): Cisco and Cisco stakeholders and developers.


1 A

28 Mar

1 A

We need to seek to be 1 A in our effort to solve the challenges inherent with Climate Change.

Here is a recent graphic generated by Michelle St.P (Ross) for SD1 Incorporated. The 1 A position in service is a takeoff on the Selective Service classification of the United States. [http://en.wikipedia.org/wiki/Selective_Service_System] However, the 1 A listing can also apply to being available for service in the effort to face the 21st Century challenges of Climate Change and new energy technology for our human family. Climate Change cannot be ignored. We must work together on solutions, not disputes.

http://sd1.co, http://facebook.com/sd1, Project – SD1 R&D

A Key! Things that Inspire.

27 Aug

The Interior Department’s Beautiful Instagram Account

In four short years, the agency has transformed from digital dinosaur to social pioneer.

posted August 27, 2013


The Interior Department has figured out Instagram, thanks to director of digital strategy Tim Fullerton and the natural beauty of the great American outdoors. The agency “protects America’s natural resources and heritage, honors our cultures and tribal communities, and supplies the energy to power our future” and has taken to social media to engage citizens in that effort.

Because of its emphasis on imagery, Instagram has become a key part of the Interior Department’s social strategy. Fullerton makes getting 10,000 “likes” on an image seem easy, but the truth is that the agency has been working extremely hard to expand its digital presence, according to GovDelivery:

The U.S. Department of the Interior (DOI) Director of Digital Strategy, Tim Fullerton, knows a thing or two about making lemonade. When he took over the department’s digital communications in 2009, the DOI had no content management system for its website, no social media accounts, no video capabilities and no dedicated staff. It was, in essence, pre-historic. But with a lot of passion and a fine-tuned strategy, Fullerton has managed to transform the DOI’s web presence from a few lemons into a delicious John Daly cocktail with a twist.

Instagram presents the perfect opportunity for the Interior Department to engage directly with a young demographic. In just four years, the agency has accumulated 20,000 Facebook fans, 94,000 Twitter followers and 114,000 Instagram followers. Fullerton explained the DOI’s successful strategy to Alex Fitzpatrick in a recent Mashable article:

“We launched our Instagram just basically to show the public all of the amazing lands that we have across the country,” Fullerton said.

Most of the Interior Department’s Instagram photos come from department staff or from the public’s submissions to photo contests. Interior is running one such contest this summer — the “Summer in America’s Great Outdoors” project, which asks parkgoers to submit their park photography to this Flickr collection.

The Interior’s Instagram account posts a combination of native iPhone or other mobile photographs along with adapted DSLR imagery. Fullerton says he occasionally posts photos he sees tagged in national parks, though only when he’s “gotten explicit permission” from the photographer.

“If we’ve seen some really striking amazing images, we’ve emailed asking for permission,” he said. “We’re always sure to give full credit on Instagram.”

Fullerton also said his team “got a pretty positive response” to early experiments with Instagram video, so he’ll be looking for more opportunities to experiment with it in the future.

The department’s efforts are paying off. With more than 114,000 Instagram followers, the Interior Department is pioneering new strategies for government–citizen communication. Check out the images below to see exactly why the account is growing so quickly.

About the Author

Jimmy, Online Content Manager, is a tech writer, marketer, photographer, SEO nerd and social media addict.  Follow him on Google+ and Twitter.

A Key! Advertising and The new Shell push. Let’s go.

14 Aug

So, let’s get started. Or, in the alternative…. “Let’s go”

Campaigns are like that: a reach for your heart, a reach for your mind to imagine and inspire, and last, a reach for your support.  All said, advertising has been with us since… well, a long time.  Uses of media change, and methods that draw on emotion, memory, and even boring stuff like geo-demographics play a key part in getting a message to you.  So you be the judge of the value of advertising, and of how advertising plays a key role in your decisions and lifestyle.

Here is a link to one of Shell’s latest. “Let’s go.”


Very nice, and there is a “Privacy Policy.”

(more to follow)



The internet of Things. M2M, Energy

9 Aug

SDDC, Incorporated believes in continuing the discussion to solicit grand-challenge thinking.  Management is pleased to share an interesting article recently published on Forbes.com.  We hope you are engaged in the M2M discussion.

Peter Kelly-Detwiler, Contributor
I cover the forces and innovations that shape our energy future.
ENERGY | 8/06/2013 @ 7:57AM
Machine To Machine Connections – The Internet Of Things – And Energy

Image: barcodesinc.com

Much has been written in recent years about the growth of Machine to Machine (M2M) connections, also known as “The Internet of Things.” As with many concepts, it’s simple to conceptualize and much more difficult to put in place. And, as with many other emerging concepts, the value may not always be where you think it is.

In an effort to better understand where this trend is going in the energy space, I recently interviewed executives from Axeda, Wipro, and AT&T about their efforts in this arena. While there is huge progress being made in some areas, there is a great deal of work yet to be done.

Grossly simplified, the trio of Axeda, Wipro, and AT&T are working on this effort in the following way:

Axeda – a company located in Foxboro, MA, creates the software solutions in the cloud that product manufacturers use to collect data from their machines, analyze it, and integrate the data into their business systems. The manufacturers can then utilize this machine data to understand usage and behavior, build models, and figure out new ways to drive value. Dan Murphy, Axeda’s VP of marketing calls it “the digital umbilical cord providing connectivity from the end asset to the manufacturer or service provider.” The goal of that is to “give the manufacturer 24 x 7 knowledge as to what is going on with that product. The connected product is in an always-on relationship with respect to service delivery and data communication.”

AT&T needs no introduction, but you may not know how deeply involved it has been in the M2M movement over the past decade. According to Mobeen Khan, Executive Director of M2M Solutions at AT&T, the company has the largest number of connected devices – 15.2 million at last count – and close to 1,500 devices certified to run on its networks (in areas such as meters and equipment on gas turbines, specialized devices are needed). By providing connectivity to these machines, by working with Axeda and Wipro, and through enterprise cloud and analytic capabilities, AT&T is now also playing the role of a solutions integrator. Since information must flow smoothly and seamlessly from machine to machine, the connectivity AT&T provides is critical.

Wipro, for its part, is an enormous multinational IT, consulting, and outsourcing company engaged in numerous activities. Among other areas, it focuses on media and telecom, manufacturing, healthcare, and energy and utilities. Together, AT&T, Axeda and Wipro provide end-to-end business solutions. Alan Atkins, Vice President and Global Head of M2M at Wipro, characterizes the Axeda piece as “the best-of-breed enablement layer (not connectivity – that’s the role of telcos), but where you collect the data, connect objects, and enable different solutions.” Atkins is an expert in M2M and has also spent a good deal of time in energy. At the end of the day, he notes, the value of M2M is being driven by “a need to know, a need to control, and a need to secure.”

OK, so now you have the key elements: Assets that talk (through AT&T’s connectivity layer) to a cloud-based platform (Axeda’s Machine Cloud™) that allows the user to intelligently query the machines and ask the “so what?” (those are the solutions provided by Wipro).

So what? There are a lot of answers to that question, but let’s focus for a moment on the areas of power generation and consumption. The power grid is a complicated beast, involving millions of interconnected parts: power plants, transmission lines, transformers, distribution lines, and literally millions of energy-consuming assets, from printing presses and air compressors to refrigerators and electric toothbrushes. Supply and demand must always be in balance (to avoid instability, including blackouts), and we currently don’t have cost-effective storage technologies for our electrons. Thus, there is a constant need to ramp generating plants up or down in order to meet fluctuating demand. This balancing act happens at regional or sub-regional levels, and electricity prices can vary tremendously based upon the cost of energy supply, transmission constraints in getting electricity to areas of demand, and other factors.
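The balancing act described above can be sketched as a toy simulation. The numbers, the time step, and the simple ramp-limit rule below are illustrative assumptions, not how any real grid operator dispatches plants:

```python
def balance_grid(demand_profile, start_output, ramp_limit):
    """Toy dispatch: each step, move total generation toward demand,
    limited by how fast plants can ramp (MW per step)."""
    output = start_output
    history = []
    for demand in demand_profile:
        gap = demand - output
        # Ramp up or down, but never faster than the ramp limit allows.
        output += max(-ramp_limit, min(ramp_limit, gap))
        history.append(output)
    return history

# Demand jumps from 1000 MW to 1300 MW; plants can ramp 100 MW per step,
# so generation needs three steps to catch up.
print(balance_grid([1000, 1300, 1300, 1300, 1300], 1000, 100))
# [1000, 1100, 1200, 1300, 1300]
```

The lag between the demand jump and generation catching up is exactly the window in which prices spike and blackout risk rises.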

Image: gereports.com

On the energy supply side, M2M is already well ensconced in areas of high value. For example, according to Axeda’s Murphy, in some areas GE sells ‘power by the hour,’ getting paid on a subscription basis for how many hours their engines are in use. He notes that one GE division has up to 250 sensors in each of its 5000 turbines, bringing back data in real-time to a centralized monitoring facility where they are on the lookout for leading issues, such as temperature on the bearings, vibrations, exhaust and other areas that signal the health of the machine. If the readings fall outside of a prescribed level, GE can do a pre-emptive fix. Murphy comments that this service can have a very high level of value in pre-emptively avoiding costs: replacing a bearing and being out of service for a day is far better than failure which might take a generating unit offline for 6-8 weeks. In volatile power markets, such an outage can be extremely costly. GE notes that “for some customers just one hour of stoppage time can cost $2 million in electricity output.”
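A minimal sketch of the kind of threshold check such a monitoring facility might run. The sensor names and prescribed limits here are hypothetical, not GE's actual parameters:

```python
def check_turbine(readings, limits):
    """Flag any sensor whose reading falls outside its prescribed band,
    so maintenance can schedule a pre-emptive fix before a forced outage."""
    alerts = []
    for sensor, value in readings.items():
        low, high = limits[sensor]
        if not (low <= value <= high):
            alerts.append((sensor, value))
    return alerts

# Hypothetical bands and a reading set with one out-of-band bearing temperature.
limits = {"bearing_temp_C": (0, 90), "vibration_mm_s": (0, 7.1)}
readings = {"bearing_temp_C": 96.5, "vibration_mm_s": 4.2}
print(check_turbine(readings, limits))  # [('bearing_temp_C', 96.5)]
```

In practice the interesting engineering is in trending these readings over time rather than single-point checks, but the economics are the same: one flagged bearing is cheaper than 6-8 weeks offline.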

On the energy demand side, integration of M2M will also have many uses. Many power markets are characterized by hourly pricing, and that may be getting even more granular in the near future. Wipro’s Atkins foresees a move in the Nordic countries to minute-by-minute billing. “At any given moment, the customer can look at what they are being charged for electricity. This will go with different price plans, and that means massive data handling. Auto-generation will let customers (with on-site generation which could range from diesel to solar) sell back into the grid. This will need to be monitored and controlled.” Other power consuming assets will also be connected to grid prices and respond accordingly. “SCADA has been very useful in control and measurement and delivery of power, but going forward we will need more knowledge and control, quickly packaged and formatted so that it can be used.” For this, the deployment of the M2M internet of things will be critical.
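The price-responsive behavior Atkins describes can be sketched roughly like this. The threshold rule and load names are illustrative assumptions, not any vendor's actual API:

```python
def respond_to_price(price_per_kwh, threshold, flexible_loads):
    """Curtail flexible loads whenever the spot price exceeds the
    customer's chosen threshold; below it, everything runs normally."""
    state = "off" if price_per_kwh > threshold else "on"
    return {load: state for load in flexible_loads}

# At $0.42/kWh against a $0.25 threshold, deferrable loads switch off.
print(respond_to_price(0.42, 0.25, ["water_heater", "ev_charger"]))
# {'water_heater': 'off', 'ev_charger': 'off'}
```

Minute-by-minute billing simply means this decision runs every minute against fresh price data, which is where the "massive data handling" Atkins mentions comes from.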

Of course, Atkins notes, “security is very key, and will be increased and increased. Encryption continues to get more complicated. The systems of today are far more easily hacked into than the ones of the future will be. We look at very high levels of encryption in packaging data. If you are just sending analog data and it’s not encrypted, you have problems. But if it’s packaged and encrypted, you have a concrete block around the data, and the data cannot be read.”

AT&T’s Khan predicts “every single asset will have an embedded communications capability and some level of secure connectivity to the cloud. We will be able to optimize across three dimensions. In the short term, we will be able to look at current data and make diagnostic decisions – for example, do we shut something down, make a firmware upgrade, or pursue some other course? In the medium term, we can capture certain data that will tell you whether to optimize a service or the product itself. And in the long run, the data will contribute to making better products or delivering superior services. For example, if you use a tractor in the field, the tractor will be sending you back data over a few years that allows you to design a better tractor.”

The same thing will likely occur with assets that both produce and consume power. Devices that are ‘market aware’ and ‘know’ market prices and grid conditions are likely inevitable, since there is potentially so much value to be gained (or cost to be avoided). Such a movement is already beginning to gain traction with various demand response providers and controls companies, and will likely pick up speed as the constituent M2M costs – connectivity, sensors, etc., continue to come down.

Khan notes, “In the area of electricity, there is already lots of optimization going on. We have smart metering and smart meter data management solutions, and outage, restoration, and notification management.” Axeda’s Dan Murphy acknowledges that much of the solution remains in a future state, but believes the transformation in the power industry has just started and will pick up speed quickly. “We think connectivity with the power grid is going to happen within five years. We already work directly with 150 of the Fortune 1000 to put intelligence in their machines. Folks like Wipro will be the ones assembling the data for the end-use customers. I expect this market to progress very rapidly.”

I asked these observers to reflect on the biggest surprise with M2M to date. AT&T’s Khan comments, “What we are finding with customers employing solutions is that they start with a specific business case. They get access to data they didn’t know what to do with before and they start to build multiple business cases…they are using more data and getting more returns on investment than they initially anticipated.”

In summary, the M2M, Internet of Things world has received a good deal of hype, with some folks still asking “so what, when are these promises going to be realized?” In energy and elsewhere, the tools are improving, the trend appears to be picking up speed, companies like GE are already reaping rewards in certain areas, and real gains at a very broad scale may be just around the corner.

A Key

10 Jul

Intellectual Property: A Key Driver of our Economy

Posted by Victoria Espinel on June 20, 2013 at 08:48 AM EDT

Innovation and creativity have always been the foundation of our economy, and effective enforcement of intellectual property rights enables us to promote economic growth, ensure our global competitiveness, and protect the health and safety of our citizens. Today’s release of the Administration’s 2013 Joint Strategic Plan for Intellectual Property Enforcement builds on our efforts to protect intellectual property to date, and provides a roadmap for our work over the next three years. In preparing the 2013 Joint Strategic Plan, we solicited public comment on how to improve our approach, and that public input was invaluable in drafting the final version of the Joint Strategic Plan. We will continue to seek public views on how to best promote and protect intellectual property rights.

Intellectual property is a key driver of our economy. So it matters that we have the right approach to intellectual property enforcement; one that is thoughtful, dedicated and effective, and that makes good and efficient use of our resources.

Ours is a Nation of entrepreneurs, inventors, and artists. The ideas that American citizens generate catalyze cutting edge research, ensure longer and healthier lives, and power the globe’s most productive economy. Our ingenuity and entrepreneurial spirit make the United States great, and we must fiercely defend that competitive advantage. As President Obama has said, “If the playing field is level, I promise you – America will always win.”

Since the first Joint Strategic Plan was released in 2010, the Administration has made tremendous progress in intellectual property enforcement. Coordination and efficiency of the Federal agencies have improved; U.S. law enforcement has increased significantly; and we have successfully worked with Congress to improve our legislation. We have increased our focus on trade secret theft and economic espionage, which give foreign governments and companies an unfair competitive advantage by stealing our technology. We have pressed our trading partners to do more to improve enforcement of all types of intellectual property. We have encouraged the private sector to do more on a voluntary basis to make online infringement less profitable as a business, consistent with due process, free speech, the privacy interests of users, competition law and protecting legitimate uses of the Internet.

Moving forward, we remain committed to protecting intellectual property and are building on the approach set out in the original Strategy. For example, we will continue to look for ways to make enforcement as coordinated and efficient as possible. We will look for ways to further increase transparency and outreach to a broad range of interests and views. We will continue to encourage companies to take voluntary steps to reduce the profit incentive from online infringement, consistent with due process, free speech, privacy interests and competition law, and we will also encourage rightholders to agree to a set of best practices to reduce infringement online.

We will review our domestic legislation to make sure it is effective and up-to-date. We will look for ways to use technology better to make enforcement more efficient and targeted. We want to discourage infringement and encourage those that are appropriately building on the works of others to create new works, so we will educate authors on how fair use works to allow creation of new works. We will increase support for small and medium sized companies that are seeking to expand into foreign markets. And we will begin collecting information on labor conditions in the manufacture and distribution of counterfeit and pirated goods overseas.

These are just some of the important initiatives that are set forth in the Joint Strategic Plan – for a complete list of all items in the 2013 strategy, see page 10.

I want to highlight two areas where we are looking for additional public input. First, we want to make sure that enforcement of patents at the border is as efficient and transparent as possible, so we are seeking views on how to improve that process. Also, we want to know whether the voluntary initiatives we have encouraged to reduce online infringement are working well and having a positive impact. So to that end, today the U.S. Patent and Trademark Office is asking the public for input on the best way to assess the effectiveness of voluntary initiatives. I encourage you to let us know your views. Public input is critical to ensure that we maintain the right approach moving forward.

I look forward to working with you to further enforce and protect American intellectual property rights. With continued leadership by the Administration and the support of Congress, the American people will continue to lead the world in innovation, and this innovation will continue to fuel our economy.

Learn more about today’s release:

Victoria Espinel is the U.S. Intellectual Property Enforcement Coordinator


Robert F. Kennedy Jr.: Renewable Energy Is Key to U.S. Growth

Daily Ticker – Tue, Oct 23, 2012 10:15 AM EDT


One of the most important issues in this year’s election is energy.

Our ongoing addiction to Mideast oil leaves us dependent on countries that are often unstable and hostile. Developing our own domestic energy resources and investing in renewable energy lessens this dependence. It also has the potential to create jobs and improve our trade deficit.

The two presidential candidates have laid out energy plans that sound similar: both President Obama and Governor Romney want to continue to develop domestic energy resources, including renewable energy, with the aim of making the U.S. less dependent on foreign oil.

But according to Robert F. Kennedy Jr., the president of environmental group Waterkeeper Alliance, the plans are different in several important ways. And President Obama’s plan, Kennedy says, is much better for the country.

“We need to be energy independent but we can’t look into the future by looking in a rearview mirror and say that we’re going to do that through carbon,” Kennedy says in an exclusive interview with The Daily Ticker. The idea that there’s not a future for wind and solar energies in the U.S. “is just a hoax.”

Related: Clean Energy: Obama Says It’s the Future, Paul Ryan Calls It a Fad

Kennedy gives the example of a solar plant being built in the Mojave Desert. The plant will be one of the largest power plants in the U.S. and will be completed in three years. Coal plants take 10 years to build, Kennedy points out, and nuclear power plants can take as many as 30 years. The solar plant costs $3 billion a gigawatt versus $15 billion for a nuclear plant – one-fifth of the cost. Alternative energy sources like solar and wind are not only environmentally friendly policies, but they’re also smarter economic choices, Kennedy says.
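Kennedy's "one-fifth" comparison is simple per-gigawatt arithmetic, using the figures as quoted in the article:

```python
# Capital cost per gigawatt of capacity, as quoted by Kennedy.
solar_cost_per_gw = 3e9     # $3 billion per gigawatt
nuclear_cost_per_gw = 15e9  # $15 billion per gigawatt

print(solar_cost_per_gw / nuclear_cost_per_gw)  # 0.2 -- one-fifth the cost
```

Note this compares installed capacity only; it says nothing about capacity factors or fuel costs, which differ sharply between the two technologies.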

“We can do it cheaper, we can do it more efficiently, but we need a national commitment to do that,” he says. “You’ve got China, you’ve got Germany, you’ve got the rest of the world who are looking forward, who are building these new technologies and we have the lead. We ought to be continuing that lead and selling them these new technologies not just lagging behind sitting on our hands and letting the Koch brothers dictate our national energy policy.”

Related: T. Boone Pickens: Biggest Deterrent To U.S. Energy Plan Is Koch Industries

According to Kennedy, Gov. Romney is merely paying lip-service to the importance of renewable energy. Romney’s primary focus, Kennedy says, is helping his friends in the traditional energy industry: oil companies, coal companies, and nuclear companies. These companies already benefit from massive and largely ignored government subsidies, and they create pollution that makes the cost of the energy they produce much higher than it initially seems.

Kennedy says he believes strongly in free-market capitalism. But he also observes that new industries often need government help, especially when they’re competing with unfathomably rich and powerful incumbents.

“Government has picked the winners,” he notes. “We give to the oil industry; we give $55 billion in direct subsidies each year. That’s more than all the renewables put together have ever gotten in history. If we stripped away the subsidies, coal could not compete in the marketplace. Oil could not compete in the marketplace and nuclear definitely could not. You know, you can burn prime rib to make energy, why are we going with the most expensive stuff?”

The environment is not the sole beneficiary of an alternative energy policy. Jobs in the wind and solar industries are high paying, plentiful and are restoring the U.S. manufacturing sector, Kennedy says.

“We’re employing more people in the wind industry than there are coal miners in America,” he points out. “Today there’s less than 14,000 miners in West Virginia and less than half of them are unionized. They have very little if any job security or pensions. The mountains of that state are being liquidated for cash, the communities are being destroyed and it’s the second poorest state in our country. There are two different models for industry and you have to ask yourself: what do we want for the American economy? Do we want to measure the economy by how many millionaires it produces or do we want to measure the economy by how, and this is how we ought to be measuring it, by how it produces jobs and the dignity of jobs over the long term for every American?”

America should make a big bet on renewables, says Kennedy. Doing so will not just reduce our dependence on Mid-East oil. It will also help build a major new industry that will create thousands of jobs, bolster American manufacturing, and help build a much more sustainable and healthy economy.

More from The Daily Ticker:

Chinese Company Sues Obama Over Wind Farm Shutdown

Romney’s Energy Plan Empowers States to Drill on Federal Lands


Project – SD1 – Thanks to Google Images.  Very Versatile.

Luke Stewart


Scientist, Bio-Tech, Energy Technology and sustainability. President, SDDC, Incorporated, San Diego, CA, Las Vegas, NV, New York, NY, Washington D.C. USA

USA · sd1.co, ENTAC

A New Key!

19 Feb

Very cool resource.  Key to engineering in the 21st Century.

vol 8, issue 2, 2011

Engineering Grand Challenges

11 February 2013
By Aasha Bodhani, Jason Goodyer, Abi Grogan, James Hayes, Mark Venables, Vitali Vitaliev
Grand challenges graphic

The 14 grand challenges and progress to date


On 12-13 March the IET hosts a major international summit in London organised by the national engineering academies of the UK, US and China to discuss progress on 14 ‘grand challenges’ identified five years ago by America’s National Academy of Engineering. We look at what they are and how close the world is to solving them.

Make solar energy economical

If we are to move away from fossil-fuel-driven energy solutions then the burden will need to be taken up by renewable energy sources, and the most bountiful among these is, of course, the power of the sun.

Over the period 2000-11, solar PV was the fastest growing renewable power technology worldwide. Cumulative installed capacity of solar PV reached roughly 65GW at the end of 2011, up from only 1.5GW in 2000.

Concentrated solar power (CSP) is a re-emerging market. Roughly 350MW of commercial plants were built in California in the 1980s; activity started again in 2006 in the United States and Spain. At present, these two countries are the only ones with significant CSP capacity, with about 1GW and 500MW installed respectively, and more under construction or development.

According to International Energy Agency (IEA) analysis, under extreme assumptions solar energy could provide up to one-third of the world’s final energy demand after 2060.

There can be no doubt that, on paper at least, solar power is an attractive proposition. Its availability far exceeds any conceivable future energy demands. But exploiting the sun’s power is not without challenges.

Overcoming the barriers that could potentially slow down widespread solar power generation will require engineering innovations in several arenas – for capturing the sun’s energy, converting it to useful forms, and storing it for future use when the sun itself is obscured.

Many of the technologies to address these issues are already in hand. But it all comes down to cost.

Commercial solar cells, most often made from silicon, typically convert sunlight into electricity with an efficiency of less than 20 per cent, although some test cells do a little better. Given their manufacturing costs, modules of today’s cells incorporated in the power grid would produce electricity at a cost roughly six times higher than current prices.
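A back-of-the-envelope model makes the two cost levers explicit. The formula and inputs below are illustrative simplifications (real levelized-cost models also include installation, inverters, financing, and degradation):

```python
def solar_cost_multiple(module_cost_per_m2, efficiency, lifetime_kwh_per_m2, grid_price):
    """Very rough levelized cost: the module's cost spread over the energy
    one square metre of it converts across its lifetime, expressed as a
    multiple of the grid price."""
    cost_per_kwh = module_cost_per_m2 / (efficiency * lifetime_kwh_per_m2)
    return cost_per_kwh / grid_price

# Doubling cell efficiency halves the cost multiple; so does halving the
# module cost -- the two routes to competitiveness named above.
base = solar_cost_multiple(600, 0.18, 45_000, 0.10)     # hypothetical inputs
improved = solar_cost_multiple(600, 0.36, 45_000, 0.10)
print(improved / base)  # 0.5
```

Because cost per kWh scales inversely with efficiency and linearly with module cost, any mix of improvements that shrinks their ratio sixfold closes the gap the paragraph describes.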

To make solar economically competitive, engineers must find ways to improve the efficiency of the cells and to lower their manufacturing costs.

Further reading




Provide energy from fusion

Nuclear fusion promises to be a viable way to solve our impending energy crisis, but its practical application is still far on the horizon.

One major barrier preventing the achievement of worthwhile levels of fusion is the durability of the structural materials used to house a fusion reactor.

While energy-rich neutrons are responsible for the main source of energy extracted from a fusion reaction, they also convert atoms in the chamber wall and surrounding blanket into radioactive material. This further weakens materials that are already working to withstand temperatures of up to 100 million degrees, preventing the confinement of radioactivity and the easy disposal of nuclear waste.

The secondary engineering challenge is economically achieving fusion for a sustained period of time. Fusion was first achieved for a significant period by the Culham Centre for Fusion Energy in its JET (Joint European Torus) facility in 1997, producing 16MW of power, a record that has yet to be broken more than 15 years on.

JET is currently the largest tokamak in existence, but is soon to be eclipsed by a successor project, ITER (International Thermonuclear Experimental Reactor), a combined research project between the United States, the European Union, Japan, Russia, China, South Korea and India.

ITER is destined to become the first tokamak to provide a sustained pulse of energy, producing up to 500MW of power.

The main areas of research aim to address the instability of nuclear fusion, including the launch of the International Fusion Materials Irradiation Facility to research potential new materials for use in fusion plants.

Inroads are already being made into improving the magnetic forces used to contain the fusion fuel as it combines and reacts. To unlock this practically unlimited supply of energy, considerable advances will need to be made in superconducting magnets, advanced vacuum systems and structural materials, as well as in developing robust robotic systems for the repair and maintenance of the reactors.

Further reading





Develop carbon sequestration methods

Increased carbon dioxide released into our atmosphere has a lot to answer for. Rising sea levels, increased storms and failed crops are just some of the side effects of its release, and it is predicted that one trillion tonnes of the stuff will need to be buried by various means of carbon capture and storage (CCS) before 2100.

Various commercial methods of capturing carbon dioxide are already in mainstream use, including in dry-ice manufacturing and the production of carbonated beverages. This process could be adapted for carbon capture within coal-burning plants by replacing smokestacks with two absorption towers: the first would remove CO2 from the remaining gases using absorption chemicals, while the second would separate the carbon dioxide from these absorption chemicals so they could be used again in the process.
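A toy mass balance shows what the two-tower scheme accomplishes. The flue-gas CO2 fraction and capture efficiency below are illustrative assumptions, not figures from any real plant:

```python
def co2_captured(flue_gas_tonnes, co2_fraction, capture_efficiency):
    """Toy mass balance for the two-tower scheme: the first tower absorbs
    a fraction of the CO2 in the flue gas; the second regenerates the
    solvent, releasing that CO2 as a concentrated stream for storage."""
    co2_in = flue_gas_tonnes * co2_fraction
    captured = co2_in * capture_efficiency
    vented = co2_in - captured
    return captured, vented

# 1000 t of flue gas at a hypothetical 13% CO2, with 90% capture:
print(co2_captured(1000.0, 0.13, 0.90))  # (117.0, 13.0)
```

The energy cost of running the second (regeneration) tower is the main reason capture raises a plant's operating cost, which motivates the oxy-fuel variant described next.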

To make this process more energy efficient, coal could also be burned in pure oxygen as opposed to the usual mix of normal air, eradicating the need to separate the carbon dioxide from the nitrogen.
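The scale of the capture problem follows directly from the combustion chemistry (C + O2 → CO2): every tonne of carbon burned yields 44/12, or roughly 3.7, tonnes of CO2. A back-of-envelope sketch, assuming an illustrative 70 per cent carbon content for coal:

```python
# Back-of-envelope CO2 yield from burning coal (C + O2 -> CO2).
# Molar masses: C = 12 g/mol, CO2 = 44 g/mol. The 70% carbon
# content is a rough illustrative figure, not a measured value.

M_C, M_CO2 = 12.0, 44.0
CARBON_FRACTION = 0.7  # assumed typical carbon content of coal

def co2_from_coal(tonnes_coal):
    """Tonnes of CO2 released by burning the given mass of coal."""
    return tonnes_coal * CARBON_FRACTION * (M_CO2 / M_C)

print(co2_from_coal(1.0))  # ~2.57 tonnes of CO2 per tonne of coal
```

On these assumptions, each tonne of coal burned produces over two and a half tonnes of CO2 that a capture system must then deal with.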

The second step in CCS is the storage of carbon dioxide. Several potential environments have been identified by scientists and engineers as appropriate storage grounds for carbon dioxide, but none are currently foolproof sites.

Depleted oil and gas fields are an attractive prospect for storage as the carbon dioxide can be used to obtain remaining oil trapped deep in the rock sediment. Sedimentary brine formations 800m into the ground are also a viable option as the high pressure deep underground will help to keep the carbon dioxide in high density.

Unfortunately, both of these locations are prone to faults within the rock that could provide the carbon dioxide with the opportunity to leak out into the atmosphere, so engineers must design robust new systems to prevent this escape.

A third, more costly but more reliable, option is to inject the carbon dioxide beneath the ocean floor. Although this process is more complex and therefore more expensive, it has the advantage that carbon dioxide leakage is not an issue.

Manage the nitrogen cycle

The nitrogen cycle is a natural process involving the gas that makes up four-fifths of the Earth's atmosphere, and it is integral to living organisms' healthy production of proteins and DNA. However, the nitrogen produced by human activity, which has doubled since the industrial revolution, is fixed nitrogen, which is extremely difficult to convert into a useful resource outside of plant roots and lightning storms.

Innovative agricultural supply chains will need to be forged in order to maintain a sustainable food system. This would reduce the overall effect of the nitrogen cycle on the environment, in turn reducing the rate at which fixed nitrogen is produced, or improving the rate at which it is broken down into organic nitrogen. The engineering challenge is to come up with a viable way to produce organic nitrogen. Finding new ways to control the release of nitrous oxide into the air as fossil fuels are burnt will also be a challenge for future engineers.

High-yielding crops, while producing more food per plant, rely heavily on rich fertiliser, which contributes significantly to the nitrogen cycle. If fertiliser were used more efficiently, for example by reducing run-off and erosion during plant growth, then more than the current half of the hard-to-break-down fixed nitrogen would end up in harvested plants.

The next challenge is to create a viable way to prevent the leakage of fixed nitrogen in farming, which can occur anywhere in the supply chain, from the field and animal feeding to the sewage plant treating the waste. Improved waste recycling also holds the key to a marked reduction in the amount of fixed nitrogen released into the air.

Manure from cows and other livestock is an ideal nutrient-rich fertiliser, but engineers must come up with a sustainable method of producing manure pellets to overcome issues such as logistics. Livestock also pose an environmental problem in their own right, producing high levels of methane, itself a damaging greenhouse gas. New ways of reducing the release of such gases from waste, and of putting them to use as a resource, must therefore be considered.

Provide access to clean water

Access to clean water would seem to be one of the fundamental needs for modern life, but the water that many of us take for granted is still not available to everyone around the world. Many women and children, particularly in rural areas in developing countries, spend hours each day walking miles to collect water from unprotected sources such as open wells, muddy dugouts or streams.

In urban areas they collect it from the polluted waterways that surround the towns, or pay high prices to buy it from vendors who obtain it from dubious sources. The water is often dirty and unsafe, but they have no alternative.

Diarrhoeal diseases caused by unsafe water and poor sanitation, such as cholera, typhoid and dysentery, are common across the developing world – killing 4,000 children daily. People suffering from these diseases or caring for children who are suffering from them are often unable to work to earn money, yet face large medical bills.

Total global investments in water and sanitation would need to double for the Millennium Development Goal targets of halving the proportion of people living without water and sanitation by 2015 to be met.

But water for drinking and personal use is only a small part of society’s total water needs – household consumption usually accounts for less than 5 per cent of total water use. In addition to sanitation, a large proportion of the water we use is for agriculture and industry.

Technologies are being developed to improve recycling of wastewater and sewage treatment, for instance, so that water can be used for irrigation or industrial purposes.

A different technological approach to the water problem involves developing strategies for reducing water use. Agricultural irrigation consumes enormous quantities of water; in developing countries, irrigation often exceeds 80 per cent of total water use. Improved technologies to more efficiently provide crops with water, such as drip irrigation, can substantially reduce agricultural water demand. Water loss in urban supply systems is also a significant problem.
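The leverage of more efficient irrigation can be illustrated with a simple sketch: the water a crop actually needs versus what must be withdrawn at a given delivery efficiency. The efficiency figures below are assumptions for illustration, not measured data:

```python
# Illustrative irrigation model (assumed efficiencies, not field
# measurements): the less efficient the delivery method, the more
# water must be withdrawn to meet the same crop requirement.

def water_withdrawn(crop_need_litres, efficiency):
    """Litres that must be withdrawn to deliver the crop's need."""
    return crop_need_litres / efficiency

flood = water_withdrawn(1000, 0.45)  # assumed flood-irrigation efficiency
drip = water_withdrawn(1000, 0.90)   # assumed drip-irrigation efficiency
saving = flood - drip                # litres saved per 1000 litres needed
```

Even with these rough numbers, doubling delivery efficiency halves the water that must be withdrawn, which is why drip irrigation can cut agricultural demand so substantially.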

Advance health informatics

Concerned with the acquisition, management, analysis and use of medical information, health informatics is a far-reaching field taking in everything from personal medical records to data concerning diseases.

The cost of disk space has fallen steadily over the last 30 years, leaving many industries grappling with information overload and giving rise to 'big data'. Health informatics is no different. As the amount of data grows, software must offer clinicians access to information relevant to each patient, archival medical research material and a decision support system, while remaining mindful of pitfalls such as breaches of patient confidentiality and the misuse of data by medical insurers or employers.

Another problem lies in bringing the old, largely paper-based system of record keeping up to date with a new computerised one. That is easier said than done, considering that many of the programmes used to store data are incompatible, sometimes even within the same hospital. Future systems must be engineered to share data across all of the different systems in use in the various departments, creating a fully integrated regional, national and global health informatics network.

Methods of data collection are, similarly, in a state of great change. Soon wearable devices that monitor pulse, temperature or other important measurements could be embedded within clothing or even the body. These sensors could contain transmitters and receivers to monitor a patient’s state, whether in a hospital or at home, and alert medical staff if complications or problems arise, or even tell them to administer drugs.
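The kind of threshold monitoring such wearable sensors would perform can be sketched very simply. This is an illustrative toy, not any real medical device's logic, and the 'normal' ranges below are assumed for demonstration only:

```python
# Toy vital-signs monitor: flag any measurement outside an assumed
# normal range, as a wearable sensor might before alerting staff.
# Ranges are illustrative, not clinical reference values.

NORMAL_RANGES = {
    "pulse_bpm": (40, 120),
    "temperature_c": (35.0, 38.5),
}

def check_vitals(reading):
    """Return a list of alert messages for out-of-range measurements."""
    alerts = []
    for name, value in reading.items():
        low, high = NORMAL_RANGES[name]
        if not (low <= value <= high):
            alerts.append(f"{name}={value} outside [{low}, {high}]")
    return alerts

print(check_vitals({"pulse_bpm": 150, "temperature_c": 37.0}))
```

A real system would add calibration, trend analysis over time and clinical validation, but the principle of comparing a sensor stream against expected ranges is the same.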

On a larger scale, the power of health informatics could be harnessed to combat the outbreak and spread of disease. A viral threat such as Avian flu H5N1 could spark a global pandemic. Early warning systems that monitor data on hospital visits and orders for drugs or lab tests are already in place in some countries but more sophisticated methods are required. New strategies for producing vaccines in large quantities must also be devised, perhaps using faster cell culture methods.

Engineer better medicines

The sequencing of the human genome in 2003 represented a true paradigm shift in biology and medicine, the effect of which is likely to be felt for many years.

Human DNA contains more than 20,000 genes, which are largely the same in all humans. A small number, less than 1 per cent, differ from one individual to the next giving us our own identity, personality and appearance. These differences also give rise to unique elements of brain and body chemistry and can predispose people to certain illnesses or alter the way in which they respond to medications. Knowledge of a person’s unique genetic make-up, therefore, can be used to tailor drugs to meet an individual’s unique needs potentially leading to a new era of personalised medicine.

Standing in the way of this, however, are several challenges: collecting and managing the huge amounts of data required; developing better systems to assess a patient’s genetic profile; and creating inexpensive diagnostic devices that can detect minute amounts of chemicals in the blood.

Currently, medication is often prescribed incorrectly, encouraging the development of drug resistance without any associated benefit. Faster, more effective diagnostic methods would allow larger numbers of drugs to be screened promptly, so that the most appropriate treatment can be applied sooner and the problem reduced.

Traditionally, antibiotics are chosen that attack a wide range of bacteria, as clinicians cannot always be sure which bacteria are actually causing a given problem. Analytical methods that pinpoint the exact nature of an infection could facilitate the use of more narrowly targeted drugs, reducing the risk of the bacteria developing resistance. Certain viruses, too, may be combated by engineering small molecules to attack their RNA and prevent them from reproducing.

Elsewhere, advances in the field of synthetic biology mean it may soon be possible to regenerate tissue or organs or even grow replacements from scratch. Researchers in nanotechnology may soon be able to design systems that are hosted by the body and release, say, insulin when the blood glucose levels are high.

Restore and improve urban infrastructure

Infrastructure is the lifeblood of any city, delivering water and power, removing waste and allowing the efficient movement of its inhabitants and goods. But many of our established cities are serviced by ageing infrastructure, much of which can trace its roots back to the Victorian age. Keeping all these assets in good working order is not a new task, but it is a growing challenge.

Vast amounts of the existing infrastructure are buried, posing several problems for maintaining and upgrading it. One major challenge will be to devise methods for mapping and labelling buried infrastructure, both to assist in improving it and to help avoid damaging it.

A project of this sort is now underway in the UK, with the aim of developing ways to locate buried pipes using electromagnetic signals from above the ground. The idea is to find metallic structures capable of reflecting electromagnetic waves through soil, much as a reflector makes a bicycle easier to see at night.

Other major infrastructure issues involve transportation. Streets and highways will remain critical transportation conduits, so their maintenance and improvement will also remain an important challenge. But the greater challenge will be engineering integrated transportation systems, making individual vehicle travel, mass transit, bicycling, and walking all as easy and efficient as possible.

While such services can help support growing urban populations, they must be accompanied by affordable and pleasant places for people to live. Engineers must be engaged in the architectural issues involved in providing environmentally friendly, energy-efficient buildings both for housing and for business.

But in this constrained financial age, funding major projects is not easy. Numerous policies and political barriers must be overcome. And so, a major grand challenge for infrastructure engineering will be not only to devise new approaches and methods, but to communicate their value and worthiness to society at large.

Reverse engineer the brain

On 11 May 1997 IBM’s purpose-built chess computer Deep Blue defeated its human opponent, world champion Garry Kasparov, two games to one with three draws. It was a huge, symbolic victory for artificial intelligence, and a future populated with robotic waiters and Terminator-style supersoldiers seemed only a few years away.

The reality, of course, proved different. But some researchers are now taking on the challenge of creating artificial intelligence from a different angle: by starting with study of the human brain and working backwards.

The ever-increasing processor power driven by Moore's Law could see computer simulations of the brain becoming ever more detailed and accurate. By studying how the brain itself learns, researchers may be able to design computer processors that handle multiple streams of information at the same time, rather than the one-at-a-time approach currently employed. But progress is hindered by the brain's innate complexity: each nerve cell in the brain receives impulses from tens of thousands of others, so tracing the path of any given signal is extremely difficult.
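A toy leaky integrate-and-fire model gives a flavour of the signalling involved: a cell sums weighted incoming impulses, its potential decays over time, and it fires when a threshold is crossed. This is a crude illustrative sketch, not a biologically accurate model:

```python
# Toy leaky integrate-and-fire neuron. All parameters (threshold,
# leak, weights) are arbitrary illustrative values.

def simulate_neuron(inputs, weights, threshold=1.0, leak=0.9):
    """Return the time steps at which the neuron fires."""
    potential, spikes = 0.0, []
    for t, impulses in enumerate(inputs):
        # decay the stored potential, then add this step's weighted input
        potential = potential * leak + sum(
            w * x for w, x in zip(weights, impulses)
        )
        if potential >= threshold:
            spikes.append(t)
            potential = 0.0  # reset after firing
    return spikes

print(simulate_neuron([[1, 0], [1, 1], [0, 0]], [0.6, 0.6]))  # [1]
```

Real neurons receive tens of thousands of such inputs rather than two, which is precisely the complexity that makes the brain so hard to reverse engineer.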

However, there are already examples of artificial intelligence benefitting from reverse engineering of the brain. Researchers are studying patients with a damaged hippocampus, the area of the brain responsible for memory and learning, to understand the electrical signalling between nerve cells that is required for forming and recalling memories. Engineers have begun designing chips that mimic the brain's communication system.

Researchers at Duke University Medical Centre in the US have taught rhesus monkeys to control a robotic arm using only signals from their brains and visual feedback from a screen. In the future the same technology may be applied to improve the neuroprosthetic limbs for use by people who have been paralysed.

Researchers say that the technology they have developed could also improve rehabilitation of those with brain and spinal cord damage due to strokes, diseases or trauma.

Secure cyberspace

The challenge of securing cyberspace is shifting from the provision of methods for safeguarding key organisational and personal assets (valuable and sensitive data, say) against cyber criminals, hackers, hacktivists, enemy agents and other online threats, towards a baseline discipline governing the specification and design of all computerised systems that must henceforth be protected.

In an increasingly connected world in which more physical devices are Web-ready – therefore targets for cyber threats – securing cyber space often means making it safer for devices that rarely interact with humans, but that can have a direct impact on their well-being and safety.

Security expert Corey Nachreiner of WatchGuard suggests that there is now a heightened likelihood that the next 12 months will see the first cyber-attack that results in a human death. The accelerated proliferation of both networked devices and online threats will create a ‘perfect storm’ of vulnerable connected systems that, if targeted, could increase the chances of a ‘fatal malfunction’ – and the first human death as the result of a cyber-attack.

Networked road vehicles, Internet-ready medical devices and intelligent buildings are among the emerging connected physical domains that may start to be hit by the end of 2013. These devices often form part of a nation's critical national infrastructure, and cyber protection at this scale underlines the need to train a new generation of cyber security specialists equal to the task.

A further challenge lies in the advancement of defensive measures that proactively counter malicious cyber activity, so that human (and financial) resources can be deployed where they are most needed. Seculert, for example, develops threat detection tools that use Big Data analytics to inform enterprise cyber security strategy.

On a broader scale, another cyberspace security challenge is the development of techniques by which collective intelligence about hackers, virus and malware propagators, and other 'threatscape' actors can be used predictively to forewarn prospective victims of an impending strike.

Prevent nuclear terror

In the 67 years that have followed the dropping of Little Boy on Hiroshima, the necessary components for the construction of an atomic weapon have been accumulating across the globe. Since then eight countries, the US, the Russian Federation, the UK, France, China, India, Pakistan and North Korea, have successfully detonated nuclear weapons. But with many countries using nuclear material as a source of power and some warheads potentially not secure from theft or sale, the threat of nuclear attack by rogue nations or highly organised terrorist organisations is ever present.

The challenges this state of affairs presents for engineers are manifold. Most simply, an information system is needed to keep track of all nuclear weapons and material. Secondly, the global community has to be able to ensure that a nation using nuclear material in power plants is not extracting plutonium for use in weapons in contravention of international law.

One possible solution is the development of a passive monitoring device near to a reactor that would transmit real-time data about its contents. A further problem of detection is the ease with which freight can be shipped around the world. Some ten million shipping containers enter the US each year, each one capable of holding as much as 30 tonnes. Finding a few kilograms of weapons-grade material among all of the garden furniture, children's toys and other miscellanea making its way from A to B is like finding the proverbial needle in a haystack.

One solution, which has been nicknamed the nuclear car wash, involves scanning the containers as they go along a conveyer belt. As they pass through the scanner, they receive pulses of neutrons, a subatomic particle used to induce nuclear reactions. The neutrons would induce fission in any weapons-grade nuclear materials within the container which in turn would produce radioactive substances that would emit gamma rays that could be detected by the scanner.
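The trade-off between scan time and container throughput in such a scanner can be sketched with simple Poisson statistics: if induced fission yields detectable gamma events at some average rate, the probability of registering at least one event grows with scan time. The rates below are illustrative, not real scanner specifications:

```python
# Toy detection model for a container scanner: with gamma events
# arriving at an average rate, P(at least one detection) follows
# Poisson statistics. All numbers here are illustrative assumptions.

import math

def detection_probability(event_rate_hz, scan_seconds):
    """P(at least one gamma event detected) = 1 - exp(-rate * t)."""
    return 1.0 - math.exp(-event_rate_hz * scan_seconds)

# Longer scans raise detection odds but slow container throughput:
p_short = detection_probability(0.5, 2)   # ~0.63
p_long = detection_probability(0.5, 10)   # ~0.99
```

The engineering tension is clear even from this toy model: high confidence requires longer scans, but millions of containers a year leave only seconds per scan.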

Other problems which need addressing in the coming years include rendering potential devices harmless and also how to go about cleaning up should a nuclear attack take place.

Advance personal learning

Rigid exam systems can limit scope for personalised learning, but at least courses can rise to the challenge of exploiting digital technology to adapt to the individual needs of students so that each can progress in their own way.

Adaptive learning has emerged to cater for this, combining Web technologies with interactive methods to tailor courses to the student. This has gone beyond basic adaptive testing, where each question gets harder until the student fails, and beyond single-point online teaching that simply presents material in response to the pupil's recent on-screen activities.

A number of companies, such as New York-based Knewton, have developed platforms that respond in real time to the activity of each user on the system, adjusting to provide the most relevant content according to a more complex analysis of test scores, speed, accuracy, delays, keystrokes, click-streams and drop-offs.

It goes further than adaptive testing, which assesses the student’s current state of knowledge, by then assessing what material or activities would help that student best progress further. There are various algorithms used for adaptive learning, but they all analyse a student’s performance according to multiple data points, drilling down into different concepts to recommend follow up actions – in other words taking the role of a personal human tutor.
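One simple family of such algorithms works like a chess Elo rating: student ability and item difficulty ratings are nudged after each answer, and the next question is chosen to match the student's current ability. The sketch below is illustrative only and is not the algorithm of any particular company:

```python
# Illustrative Elo-style adaptive learning update. All function
# names and parameters are hypothetical, chosen for demonstration.

import math

def expected_correct(ability, difficulty):
    """Probability the student answers correctly (logistic model)."""
    return 1.0 / (1.0 + math.exp(difficulty - ability))

def update(ability, difficulty, correct, k=0.2):
    """Nudge ratings by the 'surprise' in the observed outcome."""
    surprise = (1.0 if correct else 0.0) - expected_correct(ability, difficulty)
    return ability + k * surprise, difficulty - k * surprise

def pick_next(ability, difficulties):
    """Choose the item whose difficulty best matches current ability."""
    return min(difficulties, key=lambda d: abs(d - ability))
```

Commercial platforms combine many more data points than a single right/wrong signal, but the core loop of estimating the learner and selecting matched material is the same.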

A related development is the digital textbook, an attempt to get away from the rigid book form, which is often out of date by the time it is published. One of the most advanced national digital textbook programmes started in South Korea in 2007; it is now being tested in primary schools, with plans to start free nationwide distribution this year. Digital textbooks will become essential materials for adaptive learning, capable of incorporating a core curriculum component while adapting to individual students as they interact with it online, so that each person has the sense that the book has been written just for them. They will combine text, reference material and dictionaries with multimedia content such as video clips, animations and virtual reality, linked to individual student workbooks.

Enhance virtual reality

Virtual reality (VR) has obvious applications in training military personnel, surgeons and airline pilots in safety-critical fields where it is too dangerous to let novices loose on the real thing, and this is where VR technology largely established itself.

As 3D simulation capability advanced and became allied with sensory and mechanical feedback techniques, VR has expanded into other fields such as medical treatment and industrial design. There has been spectacular progress in addressing the challenge of treating phobias and stress disorders, as in a US trial conducted in 2011 on victims of PTSD (Post Traumatic Stress Disorder) caused by incidents while serving in the Afghanistan conflict. VR simulation of the incidents was combined with physiological monitoring and training in methods to come to terms with the disorder. About 70 per cent of participants who received the VR therapy showed a clinically significant improvement in 10 weeks, compared with 12.5 per cent in conventional treatment involving psychotherapy alone. The VR method was a form of ‘stress inoculation’ combining training with controlled exposure in an environment the patient knows to be safe.

VR is being applied to industrial design at the UK’s Virtual Design Enterprise Centre at Wolverhampton University. In this context VR is an evolution of computer-aided design, where the aim was to visualise a product, and so avoid some obvious mistakes before incurring the higher costs and time lags in creating a model or prototype.

VR extends this to simulation of product characteristics such as aerodynamics or friction, as well as giving a greater sense of the final product to enable aesthetic or design aspects to be enhanced.

Another emerging application for VR to master is in surgery, both for training and for giving patients options in life-like detail. Traditionally, surgeons have trained on cadavers, dummies or animals, but these are not ideal for advanced endoscopic procedures, where VR can simulate the movements much more accurately. The UK's Golden Jubilee National Hospital near Glasgow recently set up a virtual 3D surgical training programme along these lines for medical students.

Engineering the tools of scientific discovery

The astronomical telescope, invented in the 17th century, was one of the great early examples of precision engineering leading directly to great scientific discoveries, in this case through high resolution optics.

The development of the Hubble Space Telescope took this co-operation between the two disciplines to a new level, as there was little scope for repairing or upgrading the equipment in space. But the telescope did develop a fault that would have compromised the rest of its operational life, and in 2002 Dr Edward Cheung, then principal engineer for the Hubble Space Telescope Development Project, oversaw a technically difficult service mission to install a repair component that he named Aruba (after the Caribbean island where he grew up).

Dr Cheung was recently awarded Knight of the Royal Order of the Netherlands Lion in recognition of his engineering achievements at Hubble, while on the scientific side, astronomer Adam Riess shared the 2011 Nobel Prize in Physics for the discovery of the accelerating expansion of the Universe through observations of distant supernovae (exploding stars), made in part with the Hubble telescope.

Data from the Hubble project continues to raise new challenges. On the biomedical front, there are few better examples of the symbiotic relationship between engineering and science than the Institute of Biomedical Engineering (IBME) at Imperial College London. This was set up with a £10m donation in 2004 to stimulate medical diagnosis and treatment by bringing together engineers, life science researchers and medical practitioners. It has made many original contributions to biomedical research, recognised by winning the 2009 Times Higher Education award for outstanding contribution to Innovation and Technology.

Since then the institute has continued to focus on three domains – early detection, diagnosis and real time monitoring of health conditions – providing a framework for the emerging era of personalised medicine. This will depend on accurate and often continuous monitoring of health conditions to match therapies to the specific genetic profile and current metabolic condition of the patient. The institute has developed a number of novel methods for such continuous sensing.
