NASA is honored to celebrate the 25th anniversary of the Hubble Space Telescope by presenting this award to the Hubble science team at the Smithsonian National Air and Space Museum for its continuing scientific contributions to our knowledge of the Universe. To mark the occasion, we have teamed up with the National Geographic Society and the Hubble Foundation to create a special edition of Hubble Monthly magazine celebrating the Hubble's past accomplishments, with free shipping to the U.S., Canada, and Mexico on any subscription issue for the remainder of the year. We are sure you will agree this is a fitting celebration of one of our nation's greatest scientific achievements! All 26 issues of Hubble Monthly are available here at the Smithsonian, including 10 issues covering the Hubble's Space Shuttle flight in 2001 and the telescope's historic first international observation missions of the Great Nebula in 2001 (2 issues) and 2002 (4 issues).

NASA continues to be the world leader in science, so much so that it is the only agency in the United States, not to mention the world, with a "green" badge. NASA is an organization of more than 6,000 people at every level, all working together to sustain human exploration of our Solar System and beyond. NASA provides critical science and engineering support to U.S. government missions by advancing technological innovations in space; building partnerships between government and industry to support exploration; assisting NASA's research and development programs; protecting and advancing the safety and health of scientific instruments used in the Nation's space program; and working toward national leadership in civil space technology, education, and international human spaceflight.

NASA's Office of Commercial Space Transportation was formed in 2005 under the leadership of former NASA administrator Jim Green. The NASA Commercial Space Transportation Program provides services for national security, business development, and the general public through programs, projects, and technologies with potential for commercial use of space. With the exception of some commercial missions to the International Space Station, commercial applications are limited to those that do not require unproven or technologically unviable space transportation systems, and to those that can be built and operated at prices that reduce mission risk. To learn more, visit the NASA Commercial Space Transportation Program website.


This article examines one such change at Pine Island Glacier over the last five years, during which another 5,000 metric tons have been added to its running losses. Pine Island Glacier's mass loss has changed more dramatically than ever before: to reach this new low, the glacier must shed enough ice into the ocean to compensate for the loss of surface ice, which is more than twice the normal level. To see how a few hundred thousand metric tons can slow Pine Island Glacier, we needed to study a small patch of water that is part of its shrinking, ice-free territory. I've already noted this small patch:

From the satellite imagery above, you can see a thick line running along half a mile of the northwest corner above Pine Island Glacier. This is the same line observed in earlier satellite imagery, with its bottom edge running through the center of the image. With a thinner ice pack than five years ago, Pine Island must cut this line to release the surface ice at the foot of the glacier. The area was already losing as much ice as the island had previously contained, with almost all of the mountaintop ice gone. That does not mean the shoreline is at risk. As a glacier, Pine Island is always becoming more active: as long as surface and glacier ice masses sit above the ice below, the glacier will not run out of energy to break down the smaller snow and ice masses before it reaches the edge and calves to release itself. So when the end is near, Pine Island will lose ice slowly, at a rate of about 40,000 metric tons a year. In other words, that is the same tonnage the island has lost in the last five years alone! At that pace, you might assume most of the island would already be gone for good. Instead, its shrinking and its acceleration suggest that the bottom end is still relatively undamaged.

To understand how this is happening, it is important to look at what's at ground level. To estimate where the Pine Island sector of the Antarctic Ice Sheet (AIS) stands at this point, you simply take the average of the heights of the high and low points. This line comes out a bit over three thousand feet above sea level, so most of the lost area now lies below sea level. Below the surface, the ice-free area may increase slightly in size in the months immediately following the end of the summer solstice. Averaging the heights at Pine Island Glacier from 2009 to 2014, we find that the ice-free area could grow by almost a mile.
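The elevation estimate described above, taking the mean of the highest and lowest surface points and comparing it against sea level, can be sketched in a few lines. The high and low values below are illustrative placeholders, not measured data; the passage supplies only the roughly three-thousand-foot result.

```python
# Sketch of the elevation estimate described above: the midpoint of the
# high and low points of the ice surface, compared against sea level.
# The input numbers are hypothetical placeholders, not measured values.

def midpoint_elevation_ft(high_ft: float, low_ft: float) -> float:
    """Average of the highest and lowest surface points, in feet."""
    return (high_ft + low_ft) / 2.0

# Hypothetical high/low surface points for the patch discussed in the text.
high_point_ft = 7200.0
low_point_ft = -1000.0   # below sea level, as much of the area now is

midpoint = midpoint_elevation_ft(high_point_ft, low_point_ft)
print(f"Midpoint elevation: {midpoint:.0f} ft above sea level")
# With these placeholder inputs the midpoint comes out at 3100 ft,
# i.e. "a bit over three thousand feet," matching the figure in the text.
```

Anything below that midpoint line counts toward the area "lost below sea level" in the argument above.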
In that same stretch of time, the Pine Islands of Antarctica have been gaining ice, an area we can see by examining the satellite imagery. Since 2010, the Pine Island Glacier's ice-free zone has increased dramatically in thickness. The two lines in the map above mark the base of the area where the two types of ice masses could be found. The lines overlap on both sides of the point where the Pine Islands used to meet, though the islands that could still be reached on either side of the sea ice are a bit harder to see. When they were here, the base of the ice-free area was about a mile higher than it is today; now that they are gone, the base is over three times that height. Since the land-based, island-free area is gaining ice, the Pine Island Glacier's ice-free zone now runs for eight miles. The amount of ice gained is so large that it cannot be explained by the amounts added to Pine Island alone. One thing we can tell you is that the ice on the continent is becoming more and more unstable. According to the IPCC, there are more than 10,000 glaciers on the Antarctic Plateau at any one time. The rest of the continent is in serious trouble and is probably losing ice more rapidly than the rest of the Earth: the entire continent is headed toward losing ice at an ever-increasing rate within the next ten years, and Pine Island Glacier is one of those glaciers. Because of this, it is almost certain that Pine Island will become thin enough within only five years for the ice to melt away completely.

Pine Island Glacier and Sea Ice: At 1,500+ Feet Down, Pine Island Sea Ice Is Shrinking Fast

The Pine Island sector of the AIS has been shrinking steadily over the last decade. This is an area around which the remaining ice of the continent is expanding, even as the glacier loses water.

This is achieved by using a low-temperature method to ionize the carbon in a gas phase. The reaction at room temperature also yields extremely compact graphene with even stronger optical properties, which greatly reduces fabrication costs.

In the images above, the electron beam passes through the gas phase to an electronic cell. This is accomplished by a thin-film deposition process: typically, an extremely thin layer of high-purity graphite is deposited on a substrate, and the sheet is then subjected to a beam that deposits and bonds the graphitic particles. With the carbon deposited on the substrate, the resulting material is heated by a plasma process to produce a graphene sheet in less than 5 seconds. The high temperature, coupled with excellent plasma heating technology, yields a sheet 10 to 100 times stronger than any printed graphene. At its peak, the carbon reaches a pressure of 0.2 Pa, and the graphite becomes 10.5 times stronger at 0.3 Pa. Graphene remains a very expensive medium to manufacture and will have to be scaled down in several steps.

The next step in the production of micro-graphene is to add to, or improve the performance of, existing processes and to scale them up to high throughput. This is made simple by the availability of an easily scalable system: microwave irradiation of the carbon. Conventional microwave irradiation has long been in widespread use to achieve large and rapid graphene production, and this microwave heating of carbon (MTHC) is an outstanding method for producing high-quality graphene. The MTHC product is almost uniform and exhibits the typical graphitic properties of anhydrous carbon. However, the method suffers from a variety of drawbacks, starting with the fact that, because graphite is a very inert form of carbon, the product has a very weak surface effect. That weak surface effect undermines any subsequent deposition of graphene, making the product unusable.

Since graphite is the dominant precursor, the only way to avoid this problem is to obtain graphite from graphite ore. By irradiating graphite directly to produce high-quality graphene, the surface effect of graphite is effectively eliminated. Because graphite is itself a form of carbon, this also helps in producing better and larger graphene sheets. Even better graphene can be produced by adding to, or improving on, the performance of existing electron-beam and electron-laser production of carbon flakes, yielding a more uniform and stronger graphene sheet in the short run.

(I actually have a fairly large map made of the image here.) At the start of the 21st century, the number of meltwater lakes on West Antarctica was not significantly higher than on the East Antarctic Peninsula, and South Georgia is indeed already covered by the “sea ice.” But the current volume of ice has risen steadily over decades, and a massive influx of warmer water into this zone has brought about the largest warming of water on land.

Figure 5. (NASA) The source of this massive water surge is still unknown. Although scientists can only speculate on the cause, we can conclude that it is a natural cycle that has been going on since the early days of the last ice age. Scientists at NASA and the University of East Anglia have set out to quantify the potential for this warm freshwater to flow into Antarctica or the surrounding ocean and, more importantly, to try to predict the future of this ice as it grows and grows.

Figure 6. (NASA - New Scientist News Release) To assess the potential for warm freshwater to flood this region, NASA created a computer simulation study called the “LANCE atmospheric fluid model.” The model attempts to account for the flow of these waves of warm water through the ice sheets, driven by the heat of the ocean and by the meltwater lakes at the bottom of the ice sheets. These waves of warm surface water can be seen flowing through the glaciers like a river before melting into water once they find their way from the interior to the seas. The new study also shows that this wave of warm water has the potential to grow very quickly in size as the ice grows.

The researchers at NASA are now looking for clues and research in the field of oceanography that show the potential for this wave of warmer water to flow into the Southern Ocean or its surroundings. They will also study the wave directly and try to predict how it is likely to grow over the coming decades and how it will eventually contribute to the rise in Antarctic sea ice.

“This new study is a first step in helping to understand what mechanisms are responsible for the mass flow of warm water and ice through West Antarctic glaciers, and how it may impact other large, rapidly melting areas of land,” says Dr. Michael Beggs, lead author of the study from the University of East Anglia. “It offers a window onto the potential for an ongoing wave of retreat as the planet warms and the sea ice expands. We would expect to see similar movements of sea ice in both summer and winter, but the timing of the retreat of the Greenland ice sheet or of Antarctica’s West Antarctic ice sheet could make it difficult to predict how the waves will move. This is a huge unknown, but it will be useful for future polar planning.”

“If you’ve been following me on Twitter for the last couple of months, you may have seen that I’m fairly new to the internet. I’ve been working on a number of weird and incredible projects. It took me a long time to realize that I needed to research how sea level is measured in order to find a better way to model it, and I now believe I’ve finally figured out a way to do it. This should make for a very exciting next couple of papers out of my research. If you follow me on Twitter, you can keep up with my work.”

It’s remarkable that this research is happening at all, and the implications are even greater than those of previous studies. To say the least, there are potentially important implications in this new research. If this kind of warm water is able to flow through the ice, it can have far more profound consequences for the global climate. If these warm waters invade the continent of Antarctica, there could be serious consequences for our own oceans too: ocean currents and pressure systems may change in unpredictable ways, which means some areas of the world, such as South Africa, may no longer be able to sustain human agriculture. If this flood wave, with such a large wake, hits the Southern Ocean, it is estimated that its “troughs” could be 20 to 40 times larger than those on the Antarctic Peninsula. Such huge waves could disrupt the Pacific Current, which normally acts between North and South America. This could cause the entire Indian Ocean to spawn a new tropical storm (which may in fact be what we have been anticipating coming to our shores), and if that happens, it could lead to hurricane-force winds across much of Africa, which would be very bad for a large portion of the continent.

I still don’t know why people continue to doubt (although I guess they shouldn’t) that man-made climate change is real.

With the help of the Mars Science Laboratory (MSL), a robotic rover on the surface of the red planet, the group has completed the necessary research. But it is the final work of planning a human expedition to Mars that is currently under way. The results of the last 20 years of research have led the Japanese to believe that human colonization of the red planet is both economically and logistically feasible. In the end, the next step that will determine Mars’s future will be the landing of astronauts on the surface to gather first-hand information on any life that exists there. And, yes, there is hope that we may one day colonize the planets of our solar system.

A rocket carrying a Japanese Space Agency spacecraft to Mars is set for launch in 2010. MANDEL NGAN/Getty Images

Understanding what this mission will require takes a bit of context. A human expedition to Mars poses some unique challenges, and there is no shortage of other space programs and countries undertaking similar mission plans. To begin with, a human expedition to Mars will involve four separate mission elements. First, a mission to Mars would necessitate a manned launch to place a Martian surface habitat in an early stage of its development; this will be accomplished with the SpaceX manned space shuttle mission. Because it is not feasible to conduct the mission on a single rocket, the first landing attempt will likely take place in conjunction with the Russian Space Agency’s Kurscom-2 mission. Finally, the two major operations, launch and surface operation, will begin in early 2011.

NASA’s Orion vehicle with the Space Launch System. NASA/Doug Linder

The first three of these operations are what are commonly referred to as “bases.” From orbit, astronauts will land an automated vehicle, an Earth-facing habitat module, on what the Japanese refer to as a Martian lander. After a successful landing, the module will remain on the surface while crew members continue to perform science experiments; the Martian lander will then begin “exploration of the Martian system.” After a second landing attempt, the module will become a permanent home for astronauts. The surface operation will then follow a path similar to the first two launches, with a large section of the surface covered by a habitat module designed for humans. Though no concrete plan is currently available, most speculate that one element might be an unmanned test vehicle for a lander landing attempt. This could conceivably lead to the use of a new kind of rocket, or the adoption of a more conventional launch vehicle that can accommodate such a test. After many years of development, JAXA’s latest design for the Mars surface habitat is the Space Act Aerospace Technology. A portion of the design uses three rocket components to allow a single large rocket to be released from a central base and one small rocket to launch from a secondary base in the southern hemisphere, where it will act as a small satellite for the surface habitat module.

The Japan Aerospace Exploration Agency plans to send a Space Act Aerospace Technology Mars sample-return spacecraft to the moon in 2014. A third element of the rover may be an unmanned test vehicle for a surface sample-collection mission. If successful, it will mark the first sample-return mission to the moon by a Japanese rocket. The lunar surface is almost certainly not the perfect environment for gathering samples from ice, which is why the team is working extra hard to understand the Martian atmosphere.


Biological nanotechnology uses single molecules as sensors to carry out tasks as diverse as self-assembly, imaging, diagnosis, and drug delivery. Biophotonics is a new branch of nanotechnology in which single organic molecules, such as DNA and proteins, are used as sensors that carry out biological functions.

New technique produces nanosheets a billion times thinner than paper in just 9 days

At a special molecular nanotechnology workshop, the team unveiled a new technique that promises to yield nanosheets a billion times thinner than paper in just nine days; the previous record was 13 days. The team, including the University of Washington School of Chemistry, used the new technique to fabricate nanosheets at the atomic scale. The technique, developed in collaboration with T-Cell Diagnostics of Palo Alto, CA, promises a process that could be scaled up for manufacturing nanoscale materials.

Harmonistic and functional research in the realm of living matter helps accelerate the drive to nanomanufacturing

Nanoscale technologies like these transform the way we interact with other living things, including humans. The study demonstrated that a technique combining a chemical and electrical circuit can be used in the lab to model and control the behavior of living cells called epithelial cells. The technique has the potential to be used in medical devices, consumer products, and other applications, and could open up new methods of developing nanomedicine in the health research lab. The work was presented at an international conference, the International Society for Materials Chemistry and Nanoengineering, in Japan, and published in the American Chemical Society Proceedings. “This recent study shows how the use of chemical and electrical circuits coupled with molecular engineering has the potential to bridge the gap between biology and the lab,” said Paul A. Mascart, a professor of chemistry and biochemistry at the University of Iowa, Iowa City. “A very important point to note is that this demonstration has no obvious limits to the number and complexity of cells or materials studied. As we continue to develop nanoscale materials, it will become increasingly important to work with cells within specific biology domains.”

But they also include many other things that are not included in an “unofficial” chart:

  • As part of the “Duke Energy Challenge,” we had to create the “Energy Star” logo and the “Energy Star: A Better Life” program. Some of these items are included in the official Duke Energy Challenge calendar (click the link) and some are not. We hope to add this list to a new unofficial Duke Energy Challenge page.

  • The Duke Energy Challenge logo, the “Pump Up The Volume” advertising campaign and the brand name are trademarks of Duke Energy. They are owned by Duke Energy, LLC.

  • The U.S. Government holds trademarks for a variety of Duke Energy products and services, which may include products not approved for sale by the U.S. Government. In all events, Duke Energy is not responsible or liable for these products or offerings. Please be aware of the potential for confusion, legal risk, or any other factor that could result in a customer purchasing goods or services that are not approved or certified by the government or its agencies.

A reduction in methane emissions on Earth - from the burning of fossil fuels, forest fires, and wastewater treatment plants - is needed to slow the global rate of warming.

Here’s a map of CO2 emissions over time for the world’s major cities and the total GHG emissions in each city.

Here’s a very detailed breakdown of city GHG emissions, and the corresponding CO2 footprints.

One could argue that global temperatures would still be predicted to increase even if all the GHG emissions of the world’s cities were removed. Most of the GHG emissions that occur in cities stay inside those cities.

For instance, in a very dense city (with very few or no cars) it’s relatively easy to walk to work. In a city of a million inhabitants, many more people drive to and from work and to the grocery store, and much of this “driving to and from work” falls outside the city. So, if driving to work and walking to and from the store both count as “work,” as they currently do on Planet Earth, it’s actually quite difficult for people in dense cities to avoid walking (and driving to and from work). However, the map below shows that global emissions of CO2 are largely due to land-use change. So eliminating city emissions would only affect the local climate where a person lives, and even then global warming wouldn’t stop.
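The argument above, that zeroing out city emissions would leave the land-use-change component untouched, can be illustrated with a toy share calculation. The city names and tonnages below are invented placeholders, not the figures behind the maps referenced in the text.

```python
# Toy tally of annual CO2 emissions by source category. All numbers are
# hypothetical placeholders used only to illustrate the share calculation;
# they are not the data behind the maps discussed above.

city_emissions_mt = {        # megatons CO2 per year, hypothetical
    "City A": 50.0,
    "City B": 120.0,
    "City C": 80.0,
}
land_use_change_mt = 600.0   # hypothetical land-use-change component

city_total = sum(city_emissions_mt.values())
global_total = city_total + land_use_change_mt
city_share = city_total / global_total

print(f"City share of total emissions: {city_share:.1%}")
# With these placeholders, cities account for well under half of the
# total, so removing them leaves most emissions untouched.
```

The point of the sketch is only structural: whatever the real numbers are, the city share caps how much total emissions can fall when only city sources are eliminated.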

Here is an abstract of the paper (PDF) by the scientists Michael Mann and Naomi Oreskes.

The idea was to test each of the eight “nanometer” bands, the next steps up from just a single-atom-wide band, in an experiment intended to find when a new type of material is formed, just as a new species arises from a previously existing organism. For the most successful of the experiments, the researchers ran their experiments at Los Alamos National Laboratory in New Mexico for ten years, from 1981 to 1991, and measured at least seven of the eight bands in the process.

The results showed that all of the “nanomeres” formed only after about three nanometers of heating. The group decided to pursue these “nanomeres” further to figure out the mechanism, and in the laboratory they began working with a different type of material. First, they wanted to investigate a second type of “nanomere,” one formed at very high concentrations of an unknown material. They figured it out in three steps. Then, following this process, they tried the same approach in a much smaller container: a single-nanometer cube containing an element that behaves in exactly the same way as plutonium. This was easier than conducting trials on all eight nanometer bands.

One can only hope that the next “nanomeres” will be similar to the first ones. Or that they will be the last.


(As a reminder, this is in no way an endorsement of the current weather patterns.)

A map of the snow cover in the United States (and in Antarctica) as seen from the SOHO/IACM Global Elevation Model. See how the snow map shows many areas where our mountain snow is shrinking on the left, whereas the south continues to see snow-cover growth? Snow cover is disappearing entirely, or is nearly on the way to extinction. The only thing “natural” here is our weather patterns. We’ve been so manipulated that we’re almost certain to see that kind of weather in the US again in the very near future.

From here, let me describe a particular part of the snowpack-loss process. The US snowpack is basically a snow reservoir: essentially a stable water level maintained even at high temperatures. The snowpack usually stays near that level all year long. However, this season’s “snow event,” the worst in over 100 years, created a lake that got smaller and smaller until it crashed at 65% of its normal size. Just when everything was turning (or “turning back,” as it were) to “wet weather,” a huge spike in snow cover occurred. The water level rose, and the snow froze solid, no longer able to flow over the ice; it then froze completely in a matter of weeks. We saw snow fall across almost all of the country in the days after this. This is no fluke. According to the National Snow and Ice Data Center, this area will be at record-low snow cover for the winter of 2014/2015, with the snow loss so extreme it could end up being the worst ever for that winter.

This could actually be worse than the historic drought!

This lake on Eagle Island has grown from 12% to 70% of its normal size and was up to 80% the day before the snow event. This is the kind of climate change we’re already seeing. More snow is falling across much of the land in March, on average, than has been seen since 1950. The recent ice melt is also causing the snow to recede sooner, so the total number of days on which a normal (or even above-normal) temperature day produces full snow cover, the average maximum extent of a given month, has gone back down. The new record-breaking minimum temperature (19.5 degrees Fahrenheit) is also causing the first snow-cover recession of the winter season, with the snow cover not just disappearing but not growing at all. When it becomes extremely dry, we often see the ice melt and the snow fail to reach 20 to 80% of normal, as has been seen in the last few years, and the lake starts to show little if any snow rise in the southern parts of the US in the summer. Then it starts to shrink; it will look frozen, and the lake will simply fall out of date. The ice melting isn’t just a climate-change phenomenon; it was one of the main driving factors behind the record snow extent expected this season.
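The percent-of-normal figures quoted for the Eagle Island lake can be reproduced with a small helper. The passage supplies only percentages, so the baseline "normal" area below is a hypothetical placeholder.

```python
# Helper for the percent-of-normal lake-size figures quoted above.
# The baseline area is a placeholder; the text supplies only percentages.

def percent_of_normal(current_area: float, normal_area: float) -> float:
    """Current area expressed as a percentage of the normal area."""
    return 100.0 * current_area / normal_area

normal_km2 = 10.0                     # hypothetical 'normal' lake area
observations_km2 = [1.2, 7.0, 8.0]    # i.e. 12%, 70%, 80% of normal

for area in observations_km2:
    pct = percent_of_normal(area, normal_km2)
    print(f"{area:4.1f} km^2 -> {pct:.0f}% of normal")
```

The same helper applies to the snow-cover fractions mentioned elsewhere in this section; only the choice of baseline changes.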

I am really shocked at what is happening to this snow cover. At these temperatures above freezing, water is melting very fast. In fact, a study from New Zealand showed that freezing water in a pot of ice causes the water to go from +2 to -5 to -16 degrees at very sub-freezing temperatures; the researchers compared this to water in a pot of boiling water. Even in an ice age, the boiling water evaporated before any ice formed; in fact, the boiling pot of water only heated up to boiling temperature, then cooled and froze again. The sub-freezing temperatures do not behave this way, which hardly seems possible. It turns out that the sub-freezing water (i.e., water that was warmer than the surrounding environment) is now melting faster than the surrounding water, essentially evaporating. It is also melting faster than it is being added to the surrounding water, much as water poured into a water table keeps being added until all of it is in. The sub-freezing water evaporates first and is therefore added most slowly, much as when you pour water from one pot into another. So, there you have it: we’re in very real trouble as our climate continues to change under the influence of human activity. From a scientific perspective this is truly frightening. There is nothing natural about this, and it takes us directly back to our pre-industrial age. Our snow melt this winter could be the worst snow season of the past 300 years, at least within the US. What happened here?

