The Pipeline
14th September 2006
As I’ve mentioned from time to time, my favorite idea for the goal of seawater desalination research is simply a pipe with a semipermeable membrane that you stick out in some part of the ocean with a good coastal current or rip tide.
There’s nothing like it out there right now. What’s currently being built is an Under Ocean Floor Intake and Discharge Demonstration System at Long Beach, California.
Together with its funding partners, Long Beach Water is also undertaking design and construction of an Under Ocean Floor Intake and Discharge Demonstration System, the first of its kind in the world, that will seek to demonstrate that viable, environmentally responsive intake and discharge systems can be developed along the coast of California.
That plant, incidentally, expects to save 20%-30% in energy costs for RO.
Using a small 9,000 gallon-per-day pilot-scale desalter, the Long Beach Water Department has reduced the overall energy requirement (by 20 to 30 percent) of seawater desalination using a relatively low-pressure two staged nano-filtration process, developed by Long Beach Water engineers, known as the “Long Beach Method.”
This unique process is now being tested on a larger scale. With funding assistance from the United States Bureau of Reclamation and the Los Angeles Department of Water & Power, Long Beach Water is conducting research at a constructed 300,000 gallon-per-day, fully operational facility incorporating the two-stage nano-filtration process. This large-scale facility is needed to verify the energy savings when employing full-scale membranes and energy recovery units, among other things. The goal is to verify energy savings of the two-stage nano-filtration process and to optimize the process so that it can be duplicated.
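As a rough back-of-the-envelope check on what a 20 to 30 percent saving means at that scale, here is a small Python sketch. The baseline specific energy I use for seawater RO is an assumption on my part, not a figure from the Long Beach release.

```python
# Rough estimate of what a 20-30% energy saving means for seawater RO.
# The baseline figure below is an assumed ballpark, not a Long Beach number.

BASELINE_KWH_PER_M3 = 3.5      # assumed specific energy for conventional SWRO
GALLONS_PER_M3 = 264.17

def annual_savings_kwh(capacity_gpd, saving_fraction, baseline=BASELINE_KWH_PER_M3):
    """Energy saved per year at a given plant capacity and saving fraction."""
    m3_per_day = capacity_gpd / GALLONS_PER_M3
    return m3_per_day * 365 * baseline * saving_fraction

for saving in (0.20, 0.30):
    # 300,000 gallon-per-day prototype scale mentioned in the release
    print(f"{saving:.0%} saving at 300,000 gpd: "
          f"{annual_savings_kwh(300_000, saving):,.0f} kWh/year")
```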
But the Long Beach intake and discharge system should only be considered first generation. So what’s the next generation? Interestingly enough, according to this article in photonics.com, some researchers at the New Jersey Institute of Technology (NJIT) have used steel tubing to grow carbon nanotubes.
NEWARK, N.J., Aug. 7, 2006 — In less than 20 minutes, researchers can now seed, heat and grow carbon nanotubes in 10-foot-long, hollow thin steel tubing. The ground-breaking method will lead to improvements in cleaner gasoline, better food processing and faster, cheaper ways to clean air and water, the scientists said.
“The work took us three years to develop and get right, but now we can essentially anchor nanotubes to a tubular wall. No one has ever done anything like this before,” said lead researcher Somenath Mitra, PhD, professor and acting chair of the New Jersey Institute of Technology (NJIT) department of chemistry and environmental science. Graduate and post-doctoral students who worked on the project are Mahesh Karwa, Chutarat Saridara and Roman Brukh.
This is especially interesting because of the work at Lawrence Livermore announced back in June.
Researchers at Lawrence Livermore National Laboratory have created a membrane made of carbon nanotubes and silicon that may offer, among many possible applications, a less expensive desalination.
[Artist’s rendering by Scott Dougherty, LLNL: methane molecules flowing through a carbon nanotube less than two nanometers in diameter.] The nanotubes, special molecules made of carbon atoms in a unique arrangement, are hollow and more than 50,000 times thinner than a human hair. Billions of these tubes act as the pores in the membrane. The super smooth inside of the nanotubes allows liquids and gases to rapidly flow through, while the tiny pore size can block larger molecules. This previously unobserved phenomenon opens a vast array of possible applications.
The team was able to measure flows of liquids and gases by making a membrane on a silicon chip with carbon nanotube pores making up the holes of the membrane. The membrane is created by filling the gaps between aligned carbon nanotubes with a ceramic matrix material. The pores are so small that only six water molecules could fit across their diameter.
“The gas and water flows that we measured are 100 to 10,000 times faster than what classical models predict,” said Olgica Bakajin, the Livermore scientist who led the research. “This is like having a garden hose that can deliver as much water in the same amount of time as a fire hose that is ten times larger.”
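To make the “100 to 10,000 times faster” claim concrete, here is a minimal sketch comparing a classical Hagen-Poiseuille prediction for a roughly 2-nanometer pore with the enhanced flow. All numerical inputs (pressure drop, membrane thickness, viscosity) are illustrative assumptions, not values from the LLNL work.

```python
import math

# Classical no-slip prediction for flow through one narrow pore (Hagen-Poiseuille),
# then scaled by an "enhancement factor" like the 100-10,000x reported by LLNL.
# All inputs are illustrative assumptions, not values from the LLNL paper.

radius = 1.0e-9          # pore radius, m (~2 nm diameter nanotube)
length = 2.0e-6          # membrane thickness, m (assumed)
delta_p = 1.0e5          # pressure drop, Pa (assumed, ~1 atm)
viscosity = 1.0e-3       # water viscosity, Pa*s

q_classical = math.pi * radius**4 * delta_p / (8 * viscosity * length)  # m^3/s per pore

for enhancement in (100, 10_000):
    print(f"enhancement {enhancement:>6}x: "
          f"{q_classical * enhancement:.3e} m^3/s per pore "
          f"(classical: {q_classical:.3e})")
```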
Of course anything you stick out in the ocean is going to quickly encrust in barnacles, algae and such. One solution, which I’ve mentioned previously, is sharkote, a US Navy-funded coating announced last year.
GAINESVILLE, Fla. — University of Florida engineers have developed an environmentally friendly coating for hulls of ocean-going ships based on an unlikely source of inspiration: the shark.
UF materials engineers tapped elements of sharks’ unique scales to design the new coating, which prevents the growth of a notoriously aggressive marine algae and may also impede barnacles, according to preliminary tests.
If more extensive testing and development bear out the results, the shark-inspired coating — composed of tiny scale-like elements that can actually flex in and out to impede growth — could replace conventional antifouling coatings. These coatings prevent marine growth but also leach poisonous copper into the ocean.
“The copper paints are wonderful in terms of keeping the ship surface clean, but they are poisonous and they accumulate at substantial rates in harbors,” threatening marine life, said Anthony Brennan, a UF professor of materials science and engineering and the lead developer of the coating. “By contrast, there are no toxins associated with our surface.”
Brennan’s project is being sponsored by the U.S. Navy, the world’s largest maritime ship owner, which has contributed at least $750,000 to the effort so far.
A National Science Foundation-funded project at Rutgers-Camden has recently developed a new polymer-coating process that might be appropriate for sharkote and the desalination pipes.
As gas prices continue to soar, the Navy will be eager to learn of research underway at Rutgers University–Camden. “Barnacles that attach to naval ships are a huge cost to the Navy. Imagine if you drove a car with a parachute attached; this extra drag force requires more gas,” says Daniel Bubb, an assistant professor of physics at Rutgers-Camden, who has developed a new method for coating polymers.
Used in a variety of industries, including protecting battleships from freeloading barnacles, polymers are materials made from long chains of molecules.
Thanks to a $129,463 National Science Foundation grant in its third year, Bubb and his team (including a post-doctoral fellow, undergraduate, and graduate students) are refining this new coating process. By employing a pulsed laser deposition technique, a high-power laser is focused onto a target material in a vacuum chamber, creating a plume of vaporized material. The object that is to be coated is placed in the path of the vapor. The Rutgers-Camden research team then tunes the laser to a specific vibrational mode of the polymer to ease the vaporization process and limit photochemical and photothermal damage.
This research will benefit many industries that rely solely on the most commonly used method of spin-coating, a viable technique for certain applications but inefficient for coating devices that are too large or small for its apparatus.
“With spin-coating, it’s difficult to layer and adhesion can be a problem,” says Bubb, whose research also could improve biocompatibility in devices that require coating only on very specific and sensitive areas.
The Rutgers-Camden researcher also has advanced coating polymers that are too thermally sensitive by treating materials with a solvent before using the laser. This aspect of the research is funded through a $35,000 Cottrell College Science Award.
A model for moving from R&D to manufacturing might be a deal signed by Los Alamos National Laboratory and CNT Technologies Inc., in which CNT bought the rights to some nanotech developed by the Los Alamos labs.
Senators Pete Domenici and Jeff Bingaman
LANL has big plans for nanoscience
By ANDY LENDERMAN | The New Mexican
August 22, 2006
A Seattle company has bought the rights to a nanotechnology development at Los Alamos National Laboratory and plans to manufacture a new product in the city’s research park based on lightweight nanotubes that are 100 times stronger than steel. The lab has made some longer carbon nanotubes, which makes them easier to weave into super-strong materials. A nanometer is one-billionth of a meter in size. A nanotube is a long carbon molecule and its typical size is about two to three nanometers in diameter and up to five millimeters long. The company has developed a product called SuperThread made of these nanotubes.
“What we’re working with is nanotubes that are one to five millimeters long,” Tremper said. “But those are longer than anybody else’s at the moment. It’s the longer length that allows us to spin the fibers into threads and make a usable product.”
Tremper said his company plans to have a pilot plant based at Los Alamos Research Park within six months that will produce one kilogram of SuperThread a day.
“And that will allow us to give major quantities of samples to companies and government agencies that need material that is ultra strong and ultra light,” he said.
Full-scale production — if everything goes smoothly with the pilot project — would come in about 18 months.
Tremper said the pilot plant in Los Alamos would have 15 to 20 employees. He said it’s unclear where a full-scale production factory would be located, but he said the factory would have hundreds of employees. The company is seeking investors.
The lab researchers working on the technology and the company will be in the same building, Peterson said.
Kudos to Senators Pete Domenici and Jeff Bingaman for pushing nanotechnology research.
Also Monday, U.S. Sens. Pete Domenici, R-N.M., and Jeff Bingaman, D-N.M., announced a new federal nanotechnology research effort that will be based at New Mexico’s national laboratories.
Los Alamos National Laboratory received $18.3 million for a research center, and Sandia National Laboratories in Albuquerque received $57 million. The U.S. Department of Energy is establishing research centers at three other labs as well.
“It is vital that our nation remain competitive with the rest of the world when it comes to science and technology, so the work being done at DOE labs is particularly important,” Domenici said in a news release.
If you find this blog to be interesting/useful — please link to it from your website.
State of the Art
08th September 2006
This article in Science Magazine mentions many of the scientists on the distribution list for this blog–so if you have a hankering to see what everybody is up to–click here. The article gives a pretty good overview of the state of the art in desalination research today. Everybody is trying something different. And I think too that a number of different technologies will survive, because the requirements for desalinating salty aquifers are different from those for desalinating seawater, which are different again from those for purifying municipal or agricultural runoff and waste.
One source of desalination savings in the future I think will be in the manufacturing process itself. For example, this article in Electronic News shows how the photovoltaic industry expects to drop prices.
Solar Moves Front and Center
Sep 5, 2006, 7:49 AM. Solar energy is about to get cheaper—much cheaper. In fact, the cost of installing solar panels on a roof is expected to drop to about a third of what it now costs over the next several years, turning an experimental industry into a mainstream boom.
In real dollars, that means the average residential installation will drop to $8,000 from the current $24,000, not including state and federal rebates.
That said, photovoltaic cells are not like semipermeable membranes, but they will both respond to economies of scale. As the article mentions, the cost savings won’t come from solar technology itself.
The trigger this time isn’t the solar technology itself, which has shown only slow improvement in recent years. It’s the equipment used to make solar cells. Coupling that manufacturing equipment with the current processes used in making semiconductors is expected to add huge economies of scale.
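The quoted drop from roughly $24,000 to $8,000 per installation is a factor of three. A common way to reason about manufacturing-driven cost declines is an experience curve, where cost falls by a fixed fraction each time cumulative production doubles. The sketch below is just that generic model with an assumed learning rate; it is not from the Electronic News article.

```python
# Generic experience-curve sketch: cost falls by a fixed fraction per doubling
# of cumulative production. The 20% learning rate is an assumption for illustration.
import math

def doublings_needed(cost_now, cost_target, learning_rate=0.20):
    """How many doublings of cumulative output to reach cost_target."""
    progress_ratio = 1.0 - learning_rate          # cost multiplier per doubling
    return math.log(cost_target / cost_now) / math.log(progress_ratio)

print(f"Doublings to go from $24,000 to $8,000: "
      f"{doublings_needed(24_000, 8_000):.1f}")
```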
Next week we’ll look into some interesting manufacturing methods.
Scuderi Engines, AutoCAD Designs & Nano
02nd September 2006
There is a lot of interesting stuff this past week.
Wired ran a piece on the Scuderi Engine, which promises to double fuel efficiency and drop the cost and weight of the internal combustion engine. The company that developed the engine was formed by the children of a retired, and now deceased, Massachusetts engineer and inventor named Carmelo Scuderi. They have recently received $1.2 million from the DoD and $12 million from private investors. The company is in talks with all the big automakers. The fascinating thing to me is that for all the buzz they don’t have a prototype. That won’t come out until next year. What they have is a computer simulation developed by the Southwest Research Institute and the Scuderi Group.
The Scuderi Engine might someday make for a cheaper pump.
Everyone’s heard the aphorism that form follows function. Wouldn’t it be nice to specify function for a design and have the computer spit out form? Autodesk last week claimed it could do just that.
Unlike “dysfunctional design” (a phrase coined by Ten Links editor-in-chief Roopinder Tara), functional design, according to Autodesk, “enables customers to create designs based on the functional requirements of a product before they commit to complex model geometry, allowing designers to put function before form.”
In theory, a designer will simply draw a symbolic representation of an object in simple lines and blocks (as shown below), then use input parameters to specify the object’s function. Then the CAD software — in this case, Autodesk Inventor — automatically generates the geometry. With this approach, Anagnost pointed out, “Simulation can occur at any stage, engineers focus on product function, and they model geometry only if necessary.”
The functional design approach of Autodesk Inventor, promoted by Autodesk at its recent Manufacturing Solutions Media Summit, uses a product’s function to automatically generate the required geometry.
I think this tool might save some development time.
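None of Autodesk’s actual workflow or API appears in the article, so the snippet below is only a toy illustration of the “function before form” idea: the designer states functional requirements (here, a pipe’s flow rate and pressure rating) and a routine derives the geometry. The sizing rules and names are mine, not Inventor’s.

```python
import math

# Toy "functional design" example: specify function (flow, pressure), derive form
# (pipe inner diameter and wall thickness). Sizing rules are simplified
# illustrations, not anything from Autodesk Inventor.

def size_pipe(flow_m3_s, max_velocity=2.0, pressure_pa=1.0e6,
              allowable_stress_pa=100e6):
    """Return (inner_diameter_m, wall_thickness_m) for the stated function."""
    area = flow_m3_s / max_velocity                     # keep velocity under limit
    inner_d = 2.0 * math.sqrt(area / math.pi)
    # Thin-wall hoop stress: t = p * d / (2 * sigma_allowable)
    wall_t = pressure_pa * inner_d / (2.0 * allowable_stress_pa)
    return inner_d, wall_t

d, t = size_pipe(flow_m3_s=0.5)
print(f"inner diameter ~{d:.2f} m, wall thickness ~{t*1000:.1f} mm")
```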
Get this. According to the Journal of Applied Physics, Nanoscience may provide a way to engineer materials that are virtually defect-free – perfect, that is.
A scientist at North Carolina State University has discovered that the tiny grains comprising many bulk materials can potentially contain nearly zero structural imperfections when the grains are smaller than a certain critical size, typically a few to several nanometers.
Therefore, materials created with grains of the right size could be structurally flawless.
Pretty nifty? Might make for a great space elevator or pipeline. But I’m thinking — with that level of specificity/purity/control — it might also be possible to introduce impurities with more control so as to affect the charge of a membrane.
So how would you introduce impurities? I dunno.
But curiously, a method for the desktop printing of carbon nanotubes was announced this past week by Rensselaer Polytechnic Institute.
Using an off-the-shelf inkjet printer, a team of scientists has developed a simple technique for printing patterns of carbon nanotubes on paper and plastic surfaces. The method … is described in the August 2006 issue of the journal Small.
Most current techniques to make nanotube-based devices require complex and expensive equipment. “Our results suggest new alternatives for fabricating nanotube patterns by simply printing the dissolved particles on paper or plastic surfaces,” said Robert Vajtai, a researcher with the Rensselaer Nanotechnology Center at Rensselaer Polytechnic Institute and corresponding author of the paper.
Vajtai and his colleagues at Rensselaer – along with a group of researchers led by Krisztián Kordás and Géza Tóth at the University of Oulu in Finland – have developed an approach that uses a commercial inkjet printer to deposit nanotubes onto various surfaces. They simply fill a conventional ink cartridge with a solution of carbon nanotubes dissolved in water, and then the printer produces a pattern just as if it was printing with normal ink. Because nanotubes are good conductors, the resulting images also are able to conduct electricity.
Smart Pigs
25th August 2006
Over the last several weeks there has been a spate of articles on the Alaska Pipeline. The pipeline was corroded and needed to be cleaned. The tool used to inspect the pipeline is called a Smart Pig.
BP has expressed surprise at the amount of corrosion discovered on the Prudhoe Bay network.
The firm says it went nine years without using a robotic device called a “pig” to clean out its lines because company officials did not think the procedure was necessary.
Pigs are used frequently in Canada, and advertisements for such procedures, performed by service companies, appear often in trade publications.
And again here
Coffman’s reports show, among many things, that BP didn’t run a standard industry test using a robot called a “smart pig,” and that the company didn’t place “coupons” in proper locations. Coupons are pieces of metal inserted into the pipeline flow, and then inspected to determine if corrosion has occurred.
BP said it scheduled a smart pig test for the western line of Prudhoe Bay for the summer of 2006 after ultrasonic testing last fall pointed to the highest rate of corrosion in six years. But the line, which remains open, suffered the March oil spill before the test could be conducted.
BP executives say the corrosion on the pipelines was surprising, because those lines carried oil processed to remove the impurities that cause corrosion. “With that situation, we did not expect the severe corrosion we found,” said Steve Marshall, president of BP Exploration, at a legislative hearing in Anchorage last week.
I don’t know what genius called these things Smart Pigs. But the name is catchy. It conjures the same Ghost of Christmas Present that called Mr Scrooge’s servant Bob Cratchit and blessed him with a four bedroom house on 15 shillings a week.
huh?
Sounds frugal. Scrooge and Cratchit are frugal. Oil and oil pipelines are expensive. They can only get away with high construction and maintenance bills because of the high cost of oil.
How would you build a 1,000-mile water pipeline that would go uphill inland from the ocean in such a way as to minimize the cost of construction and maintenance over, say, 50-100 years, and thereby minimize the cost of the water?
Beats me.
But there are just a plethora of tools out there on the shop floor that could address this question and over time “evolve” — or model — some interesting solutions.
We have discussed modeling for a semipermeable membrane such that, in the end, you could just stick a pipe in the ocean and have only fresh water flow into the pipe. Another model might be for a machine that could suck any kind of dirt or rock into one end and extrude strong, durable pipe out the other. Another model would be a search for the cheapest combination of passive and active pumping to make water flow inland uphill for 1,000 miles. Another model would be for the cheapest, most long-lasting way to coat the inner pipe so as to avoid corrosion, algae build-up or sedimentation. Another model would be for the cheapest way to monitor and fix a break in the pipeline caused by, say, earthquakes, pipe bombs or corrosion. Another model would be for the cheapest way to monitor water quality as it moves up the line. Hey, this is fun.
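Just to put a rough number on the pumping question, here is a minimal sketch of the energy needed to lift a cubic meter of water inland; the elevation gain, friction allowance, and pump efficiency are assumptions I picked for illustration, not engineering figures.

```python
# Back-of-the-envelope pumping energy for moving water inland and uphill.
# Elevation gain, friction allowance, and pump efficiency are assumed values.

RHO = 1000.0      # kg/m^3, density of water
G = 9.81          # m/s^2

def pumping_kwh_per_m3(lift_m, friction_fraction=0.5, pump_efficiency=0.8):
    """kWh to move one cubic meter up lift_m, with friction losses added
    as a fraction of the static lift and divided by pump efficiency."""
    joules = RHO * G * lift_m * (1.0 + friction_fraction) / pump_efficiency
    return joules / 3.6e6

# e.g. an assumed 300 m net elevation gain over a 1,000-mile line
print(f"~{pumping_kwh_per_m3(300):.2f} kWh per cubic meter")
```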
And now two smaller Cratchits, boy and girl, came tearing in, screaming that outside the baker’s they had smelt the goose, and known it for their own; and basking in luxurious thoughts of sage and onion, these young Cratchits danced about the table, and exalted Master Peter Cratchit to the skies, while he (not proud, although his collars nearly choked him) blew the fire, until the slow potatoes bubbling up, knocked loudly at the saucepan-lid to be let out and peeled.
The Golden Age of Math
18th August 2006
There is a legitimate question to ask, I think, with regard to last week’s post about the predictions of Mihail Roco, senior advisor for nanotechnology to the National Science Foundation: why is he so confident that fantastic things are going to happen in materials research, in five-year intervals or so, over the next 20 years?
The brief answer is that he is not only confident in the current generation of tools and methodologies but also confident in their steady improvement. We’ve already discussed where computing power will be in 5-10 years, and how the NSF & DOE are funding research to make supercomputers 1,000 times more powerful than today’s within 10 years. The second, equally important part of this is that the National Science Foundation is funding all kinds of exotic math projects. These math projects form the basis for algorithms, which lie at the heart of computer software programs.
This past week, press releases on two NSF-funded math projects came out which have bearing on desalination membrane/catalyst work. The first math project may have some bearing on sensors that can act in real time to give a complete picture of activity across the entire surface of membranes/catalysts, with only a limited number of sensors. As mathematician Robert Ghrist at the University of Illinois at Urbana-Champaign puts it, “Using topological tools, however, we can more easily stitch together information from the sensors to find and fill any holes in the network and guarantee that the system is safe and secure.”
Anyhow, here is the PR:
Mathematician uses topology to study abstract spaces, solve problems
CHAMPAIGN, Ill. — Studying complex systems, such as the movement of robots on a factory floor, the motion of air over a wing, or the effectiveness of a security network, can present huge challenges. Mathematician Robert Ghrist at the University of Illinois at Urbana-Champaign is developing advanced mathematical tools to simplify such tasks.
Ghrist uses a branch of mathematics called topology to study abstract spaces that possess many dimensions and solve problems that can’t be visualized normally. He will describe his technique in an invited talk at the International Congress of Mathematicians, to be held Aug. 23-30 in Madrid, Spain.
Ghrist, who also is a researcher at the university’s Coordinated Science Laboratory, takes a complex physical system – such as robots moving around a factory floor – and replaces it with an abstract space that has a specific geometric representation.
“To keep track of one robot, for example, we monitor its x and y coordinates in two-dimensional space,” Ghrist said. “Each additional robot requires two more pieces of information, or dimensions. So keeping track of three robots requires six dimensions. The problem is, we can’t visualize things that have six dimensions.”
Mathematicians nevertheless have spent the last 100 years developing tools for figuring out what abstract spaces of many dimensions look like.
“We use algebra and calculus to break these abstract spaces into pieces, figure out what the pieces look like, then put them back together and get a global picture of what the physical system is really doing,” Ghrist said.
Ghrist’s mathematical technique works on highly complex systems, such as roving sensor networks for security systems. Consisting of large numbers of stationary and mobile sensors, the networks must remain free of dead zones and security breaches.
Keeping track of the location and status of each sensor would be extremely difficult, Ghrist said. “Using topological tools, however, we can more easily stitch together information from the sensors to find and fill any holes in the network and guarantee that the system is safe and secure.”
While it may seem counterintuitive to initially translate such tasks into problems involving geometry, algebra or calculus, Ghrist said, doing so ultimately produces a result that goes back to the physical system.
“That’s what applied mathematics has to offer,” Ghrist said. “As systems become increasingly complex, topological tools will become more and more relevant.”
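Ghrist’s method uses algebraic topology, which I won’t try to reproduce here. The snippet below is only the naive, brute-force version of the same coverage-hole question (given sensor positions and a sensing radius, is any part of the region uncovered?); every number in it is made up for illustration.

```python
# Naive brute-force check for coverage holes in a sensor field.
# This is NOT Ghrist's topological method, just the plain-geometry version
# of the same question. Positions, radius, and grid spacing are made up.

def has_coverage_hole(sensors, radius, width, height, step=0.5):
    """Sample the region on a grid; return True if any sample point
    is farther than `radius` from every sensor."""
    r2 = radius * radius
    y = 0.0
    while y <= height:
        x = 0.0
        while x <= width:
            covered = any((x - sx) ** 2 + (y - sy) ** 2 <= r2
                          for sx, sy in sensors)
            if not covered:
                return True
            x += step
        y += step
    return False

sensors = [(2, 2), (2, 8), (8, 2), (8, 8), (5, 5)]
print(has_coverage_hole(sensors, radius=3.0, width=10, height=10))
```

The topological approach Ghrist describes gets at the same answer without sampling every point, which is what makes it attractive for large, shifting networks.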
A second PR on an NSF-funded math project deals with the math of minimal surfaces. I reckon it would be used for modeling and simulation.
For most people, soap bubbles are little more than ethereal, ephemeral childhood amusements, or a bit of kitsch associated with the Lawrence Welk Show.
But for Johns Hopkins University mathematician William Minicozzi, the translucent film that automatically arranges itself into the least possible surface area on the bubble wand is an elegant and captivating illustration of a mathematical concept called “minimal surfaces.” A minimal surface is one with the smallest surface area that can span a boundary.
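For the mathematically inclined, the standard way to state this is that a surface is minimal when its mean curvature vanishes everywhere; for a surface written as a graph z = u(x, y), that condition becomes the classical minimal surface equation below.

```latex
% A minimal surface has zero mean curvature; for a graph z = u(x,y)
% this becomes the classical minimal surface equation:
H = 0
\quad\Longleftrightarrow\quad
(1 + u_y^2)\,u_{xx} - 2\,u_x u_y\,u_{xy} + (1 + u_x^2)\,u_{yy} = 0
```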
What does this have to do with membranes?
“Minimal surfaces come up in a lot of different physical problems, some more or less practical, but scientists have recently realized that they are extremely useful in nanotechnology,” he said. “They say that nanotechnology is the next Industrial Revolution and that it has the potential to alter many aspects of our lives, from how we are treated for illness to how we fulfill our energy needs and beyond. That’s why increasing numbers of material scientists and mathematicians are discovering minimal surfaces.”
Anyhow, the rest of the article is here.
A final note. In looking through, and considering, the flow of information on materials research for the last several weeks, it occurs to me that the feds are providing leadership, but not of a kind that’s generally recognized. So it may well be that the exasperation with federal leadership expressed by the MIT official last week came from not understanding what the feds are up to. Certainly it looks from here as though the answers to water problems, like those of energy, will come from materials research.
Nanotechnology’s Future
11th August 2006
Mihail Roco, senior advisor for nanotechnology to the National Science Foundation and a key architect of the National Nanotechnology Initiative, penned a piece in Scientific American this week. The article gives a roadmap for nanotechnology, which provides a good context for projecting the future of desalination research.
Today nanotechnology is still in a formative phase–not unlike the condition of computer science in the 1960s or biotechnology in the 1980s. Yet it is maturing rapidly.
Over the next couple of decades, nanotech will evolve through four overlapping stages of industrial prototyping and early commercialization. The first one, which began after 2000, involves the development of passive nanostructures: materials with steady structures and functions, often used as parts of a product. These can be as modest as the particles of zinc oxide in sunscreens, but they can also be reinforcing fibers in new composites or carbon nanotube wires in ultraminiaturized electronics.
The second stage, which began in 2005, focuses on active nanostructures that change their size, shape, conductivity or other properties during use. New drug-delivery particles could release therapeutic molecules in the body only after they reached their targeted diseased tissues. Electronic components such as transistors and amplifiers with adaptive functions could be reduced to single, complex molecules.
Starting around 2010, workers will cultivate expertise with systems of nanostructures, directing large numbers of intricate components to specified ends. One application could involve the guided self-assembly of nanoelectronic components into three-dimensional circuits and whole devices. Medicine could employ such systems to improve the tissue compatibility of implants, or to create scaffolds for tissue regeneration, or perhaps even to build artificial organs.
After 2015-2020, the field will expand to include molecular nanosystems–heterogeneous networks in which molecules and supramolecular structures serve as distinct devices. The proteins inside cells work together this way, but whereas biological systems are water-based and markedly temperature-sensitive, these molecular nanosystems will be able to operate in a far wider range of environments and should be much faster. Computers and robots could be reduced to extraordinarily small sizes. Medical applications might be as ambitious as new types of genetic therapies and antiaging treatments. New interfaces linking people directly to electronics could change telecommunications.
Over time, therefore, nanotechnology should benefit every industrial sector and health care field. It should also help the environment through more efficient use of resources and better methods of pollution control.
It’s always helpful to glance over at doings in energy research. There, the sense of both urgency and opportunity is palpable. This week MIT announced their version of the Manhattan Project.
Scientists at MIT are undertaking a big, ambitious, university-wide program to develop innovative energy tech under the auspices of the university’s Energy Research Council.
“The urgent challenge of our time (is) clean, affordable energy to power the world,” said MIT President Susan Hockfield.
Inaugurated last year, the project is likened by Hockfield to MIT’s contribution to radar — a key technology that helped win World War II.
David Jhirad, a former deputy assistant secretary of energy and current VP for science and research at the World Resources Institute, said no other institution or government anywhere has taken on such an intensive, creative, broad-based, and wide-ranging energy research initiative.
“MIT is stepping into a vacuum, because there is no policy, vision or leadership at the top of our nation,” he said.
Mr. Jhirad may be overstating his case. Certainly I would hope and trust that the same thing could not be said of water desalination research.
Technorati Profile
10th August 2006
Back in June, Sandia Labs announced that they’ll have a new desalination R&D roadmap. The press release is below. The roadmap isn’t out yet, so I’ll throw in my 2 cents on the matter.
The short of it is that imho the roadmap should announce that the goal of desalination R&D over the next 10 years is to make the cost of water desalination 1/10 of what it is today. The R&D roadmap should say that the US government is interested in laying the groundwork for making it economically possible to turn America’s deserts–and the world’s deserts–green and thereby make it economically feasible to increase the habitable size of the USA by 1/3 and double the size of the habitable earth.
Pretty ambitious? Not really. A drop in the cost of water desalination by a factor of roughly 10 in 10 years is baked into the research tools and methodologies available today. However, I think that the accelerating speed of computers will actually enable researchers to drop the cost of water desalination by a factor of roughly 10 in six years.
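To unpack what a factor-of-10 drop implies, the compounded improvement rate works out as in the sketch below (the ten-year and six-year horizons are the ones discussed above; everything else is straight arithmetic).

```python
# Annual improvement rate implied by a 10x cost reduction over N years:
# (1 / 10) ** (1 / N) is the yearly cost multiplier.

for years in (10, 6):
    yearly_multiplier = (1 / 10) ** (1 / years)
    print(f"10x cheaper in {years} years -> "
          f"costs must fall ~{(1 - yearly_multiplier):.0%} per year")
```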
The situation today is analogous to the Human Genome Project initially proposed by the DOE in 1986 and launched in 1990 under both the DOE and the NIH. The goal was to complete the project in 15 years. The first rough draft was completed in 10 years. I think the desalination roadmap should aim for 10 years rather than 15 years because the power of the tools and the quality of the methodologies are many orders of magnitude higher than they were in 1990.
Senator Pete Domenici played an early role in this project. The vision of the founders was breathtaking, even when their technology was relatively primitive. The reason they could say “The ultimate goal of this initiative is to understand the human genome” was not just because there was a need to understand the human genome but also because they had the tools and methodology to get there. The same is true today for water desalination.
In 1998 a parallel project was launched by Craig Venter and Celera Genomics. Competition from the private research group accelerated the pace of development.
A more detailed parallel history of the private Venter effort and the public DOE/NIH Human Genome Project is recounted below. The essential takeaway should be that the government provided the vision and leadership and Venter et al. provided the oomph. That said, the history is instructive:
Human Genome Project
History and ongoing developments
The Project was launched in 1986 by Charles DeLisi, who was then Director of the US Department of Energy’s Health and Environmental Research Programs (DeLisi later was awarded the Citizen’s medal by President Clinton for his seminal role in the Project). The goals and general strategy of the Project were outlined in a two-page memo to the Assistant Secretary in April 1986, which helped garner support from the DOE, the United States Office of Management and Budget (OMB) and the United States Congress, especially Senator Pete Domenici. A series of Scientific Advisory meetings, and complex negotiations with senior Federal officials resulted in a line item for the Project in the 1987 Presidential budget submission to the Congress.
Initiation of the Project was the culmination of several years of work supported by the US Department of Energy, in particular a feasibility workshop in 1986 and a subsequent detailed description of the Human Genome Initiative in a report that led to the formal sanctioning of the initiative by the Department of Energy.[1] This 1987 report stated boldly, “The ultimate goal of this initiative is to understand the human genome” and “Knowledge of the human genome is as necessary to the continuing progress of medicine and other health sciences as knowledge of human anatomy has been for the present state of medicine.” Candidate technologies were already being considered for the proposed undertaking at least as early as 1985.[2]
James D. Watson was Head of the National Center for Human Genome Research at the National Institutes of Health (NIH) in the United States starting from 1988. Largely due to his disagreement with his boss, Bernadine Healy, over the issue of patenting genes, he was forced to resign in 1992. He was replaced by Francis Collins in April 1993 and the name of the Center was changed to the National Human Genome Research Institute (NHGRI) in 1997.
The $3-billion project was formally founded in 1990 by the United States Department of Energy and the U.S. National Institutes of Health, and was expected to take 15 years. In addition to the United States, the international consortium comprised geneticists in China, France, Germany, Japan, and the United Kingdom.
Due to widespread international cooperation and advances in the field of genomics (especially in sequence analysis), as well as huge advances in computing technology, a ‘rough draft’ of the genome was finished in 2000 (announced jointly by then US president Bill Clinton and British Prime Minister Tony Blair on June 26, 2000).[3] Ongoing sequencing led to the announcement of the essentially complete genome in April 2003, two years earlier than planned.[4] In May 2006, another milestone was passed on the way to completion of the project, when the sequence of the last chromosome was published in the journal Nature.[5]
While most of the sequencing of the first human genome is “complete” the project to understand the functions of all the genes and their regulation is far from completion. The roles of junk DNA, the evolution of the genome, the differences between individuals and races, and many other questions are still the subject of intense study by laboratories all over the world.
The role of Celera Genomics
In 1998, an identical, privately funded quest was launched by the American researcher Craig Venter and his firm Celera Genomics. The $300 million Celera effort was intended to proceed at a faster pace and at a fraction of the cost of the roughly $3 billion taxpayer-funded project.
Celera used a newer, riskier technique called whole genome shotgun sequencing, which had been used to sequence bacterial genomes.
Celera initially announced that it would seek patent protection on “only 200-300” genes, but later amended this to seeking “intellectual property protection” on “fully-characterized important structures” amounting to 100-300 targets. Contrary to its public promises, the firm eventually filed patent applications on 6,500 whole or partial genes.
Celera also promised to publish their findings in accordance with the terms of the 1996 “Bermuda Statement,” by releasing new data quarterly (the HGP released its new data daily), although, unlike the publicly-funded project, they would not permit free redistribution or commercial use of the data.
In March 2000, President Clinton announced that the genome sequence could not be patented, and should be made freely available to all researchers. The statement sent Celera’s stock plummeting and dragged down the biotech-heavy Nasdaq. The biotech sector lost about $50 billion in market capitalization in two days.
Although the working draft was announced in June 2000, it was not until February 2001 that Celera and the HGP scientists published details of their drafts. Special issues of Nature (which published the publicly-funded project’s scientific paper) and Science (which published Celera’s paper) described the methods used to produce the draft sequence and offered analysis of the sequence. These drafts are hoped to comprise a ‘scaffold’ of 90% of the genome, with gaps to be filled later.
The competition proved to be very good for the project. The rivals agreed to pool their data, but the agreement fell apart when Celera refused to deposit its data in the unrestricted public database GenBank. Celera had incorporated the public data into their genome, but forbade the public effort to use Celera data.
On 14 April 2003, a joint press release announced that the project had been completed by both groups, with 99% of the genome sequenced with 99.99% accuracy.
And the rest is history:
To reiterate, imho the Desalination and Water Purification Roadmap (“Roadmap 2”) should provide the same government vision and leadership as was provided in the Human Genome Project.
Below is the Sandia PR as published in EurekAlert.
Contact: Chris Burroughs
coburro@sandia.gov
505-844-0948
DOE/Sandia National Laboratories
Desalination roadmap seeks technological solutions to increase the nation’s water supply
Sandia researchers ready to complete research roadmap
ALBUQUERQUE, N.M. — After one last meeting in San Antonio in April, Sandia National Laboratories researchers Pat Brady and Tom Hinkebein are putting the final touches on the updated Desalination and Water Purification Roadmap — “Roadmap 2” — that should result in more fresh water in parts of the world where potable water is scarce. The updated roadmap is the result of three previous meetings — two in San Diego and one in Tampa — and the last held in April, where many government agency, national laboratory, university and private partners gathered to map out the future of desalination in the U.S. The first roadmap identified overall goals and areas of desalination research and was submitted to Congress in 2003.
Brady expects the second roadmap to be completed shortly, and the Joint Water Reuse and Desalination Task Force will then submit it to Sen. Pete Domenici, R-N.M., chairman of the Senate Energy and Water Development Appropriations Subcommittee, Congress and eventually the water user and research communities. The task force consists of the Bureau of Reclamation, the WaterReuse Foundation, the American Water Works Association Research Foundation and Sandia.
The roadmap will recommend specific areas of potential water desalination research and development that may lead to technological solutions to water shortage problems.
“Population growth in the U.S. is expected to increase 13.6 percent per decade [over the next two decades],” says Hinkebein, manager of Sandia’s Geochemistry Department and head of Sandia’s Advanced Concepts Desalination Group. “There will be 29 percent more of us in 20 years. Put that together with an unequal distribution of people — more moving to Texas, California, Arizona and New Mexico where fresh water is limited — and it is easy to see we are facing a challenging water future.”
Sandia is a National Nuclear Security Administration laboratory.
Only 0.5 percent of Earth’s water is directly suitable for human consumption. The rest is composed of saltwater or locked up in glaciers and icecaps. As the world’s population grows, the increased water demand will have to come from someplace. Brackish water seems to be a natural source, Hinkebein says.
Roadmap 2 will outline the specific research needed in high-impact areas to create more fresh water from currently undrinkable brackish water, from seawater, and from wastewater. It will ensure that different organizations are not duplicating research.
Water desalination is not a new concept. In the U.S., the largest plants are in El Paso and Tampa. It is also commonplace in other parts of the world. Except for the Middle East, most desalination is done through reverse osmosis.
Brady says 43 research areas have been tentatively identified and some projects are already under way, jump started with $2 million made available for the preliminary research through a matching grant from the California Department of Water Resources. California provided $1 million and members of the Joint Water Reuse and Desalination Task Force each contributed $250,000.
Another $4 million in fiscal years 2004, 2005 and 2006 through federal Energy and Water Development Appropriations bills secured by Domenici has also funded desalination research at Sandia.
“The task force will decide which of the 43 projects get to the top of the research pile,” Brady says. “As more money is made available, universities, research groups, national laboratories and private companies will bid on projects.”
The 43 research areas in Roadmap 2 include the following:
- Membrane technologies (mainly reverse osmosis) that desalinate and purify water by pushing it through a semipermeable membrane that removes contaminants.
- Alternative technologies that take advantage of nontraditional methods.
- Concentrate management technologies that consider the disposal and/or beneficial use of desalination waste streams.
- Reuse/recycling technologies that look at ways membrane and alternative technologies can be used to more efficiently recycle water.
Much of the research could be conducted at the soon-to-be-completed Tularosa Basin National Desalination Research Facility in Alamogordo.
Nanotechnology simulations show what experiments miss
22nd June 2006
Last month Oak Ridge National Laboratory showed how they could evolve computer modeling/simulations in non-desalination-related materials research.
Last week Sandia Labs weighed in on the role of modeling/simulations for water desalination. The forum in this case was the Materials Research Society at its recent semiannual general meeting. I’ve posted the article below, and here is a link to the press release. This PR bears comment.
The two most important points are that 1) modeling/simulation results are now as good as, or better than, lab observation and experimentation, and 2) Sandia Labs, like Oak Ridge and others, is automating the way it evaluates its modeling/simulation results.
There are some stupendous kickers here that Sandia’s modelers mention.
“We need to sit back and put our mindset in a different mode,” he told his audience. “We’re all too busy doing [laboratory] research [instead of considering] how we can leverage resources to push our science to the next level.”
What did he mean? Perhaps he meant that models are sufficiently reliable that the pace of research can be greatly accelerated by dropping most physical experimentation until the models show a result scientists need to observe or replicate in their labs. The second consequence mentioned below is the relative ease with which models can be tweaked to get new results. That is, instead of rebuilding a new model/simulation from scratch, modelers can go to a database or library of models/simulations and pull the closest model/simulation “off the shelf.” A modeler/simulator could tweak the model to fit the scientist’s requirements. If the results are good, then the scientist can go to observation/creation.
There is also the matter of cost, says Fang: “With smart people developing numerical methods, models, and algorithms to use computers to study real cases, we find we can rerun calculations merely by changing computer parameters.
If the model doesn’t work, then it’s back to the drawing board. In either case the new model is added to the library.
Of course, the closest model/simulation may not be at Sandia Labs. It might be at Ames or Oak Ridge or wherever. And the model might not be directly related to desalination. Rather, the model/simulation might be something related to surfaces, catalysts or semipermeable membranes that are of interest to, say, hydrogen modelers.
Linking the modeling libraries of one or more institutions would be tough. Less tough for the desalination community might be to set up a hub-and-spoke relationship with the Sandia modeling center. Or maybe just a website could be created where scientists could post their modeling/simulation requirements, with some kind of billing/bidding system set up to connect them to Sandia. Or desalination scientists could post their model/simulation requirements at problem solvers like InnoCentive, YourEncore, and NineSigma.
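As a thought experiment only (no such shared registry exists, and every name and entry here is hypothetical), the website idea might boil down to something as simple as tag-based matching of a scientist’s requirements against a library of existing models:

```python
# Hypothetical sketch of matching a scientist's requirements against a shared
# library of existing models/simulations. Names and entries are invented.

MODEL_LIBRARY = [
    {"name": "nanopore-water-transport", "tags": {"membrane", "water", "nanotube"}},
    {"name": "catalyst-surface-adsorption", "tags": {"surface", "catalyst", "hydrogen"}},
    {"name": "brine-concentrate-mixing", "tags": {"desalination", "discharge"}},
]

def closest_models(required_tags, library=MODEL_LIBRARY, top_n=2):
    """Rank library entries by how many requested tags they share."""
    scored = [(len(required_tags & m["tags"]), m["name"]) for m in library]
    scored.sort(reverse=True)
    return [name for score, name in scored[:top_n] if score > 0]

print(closest_models({"membrane", "water", "desalination"}))
```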
Anyhow, here is the Sandia press release.
Nanotechnology simulations show what experiments miss
Rendering of Sandia simulations by Michael Chandross demonstrates significant transfer of material to the probe tip of an atomic force microscope.
Taking issue with the perception that computer models lack realism, a Sandia National Laboratories researcher told his audience that simulations of the nanoscale provide researchers more detailed results — not less — than experiments alone.
The invited talk by Eliot Fang was delivered to members of the Materials Research Society at its recent semiannual general meeting.
Fang derided the pejorative “garbage in, garbage out” description of computer modeling — the belief that inputs for computer simulations are so generic that outcomes fail to generate the unexpected details found only by actual experiment.
Fang not only denied this truism but reversed it. “There’s another, prettier world beyond what the SEM [scanning electron microscope] shows, and it’s called simulation,” he told his audience. “When you look through a microscope, you don’t see some things that modeling and simulation show.”
This change in the position of simulations in science — from weak sister to an ace card — is a natural outcome of improvements in computing, Fang says. “Fifteen years ago, the Cray YMP [supercomputer] was the crown jewel; it’s now equivalent to a PDA we have in our pocket.”
No one denies that experiments are as important as simulations — “equal partners, in fact,” says Julia Phillips, director of Sandia’s Physical, Chemical, and Nanosciences Center.
But the Labs’ current abilities to run simulations with thousands, millions, and even billions of atoms have led to insights that would otherwise not have occurred, Fang says.
For example, one simulation demonstrated that a tiny but significant amount of material had transferred onto the tip of an atomic force microscope (AFM) as it examined the surface of a microsystem.
“The probe tip changed something very, very tiny on the surface of the material,” says Fang. “It was almost not noticeable. But the property of the surface became very different.”
Laboratory observation couldn’t identify the cause of the property change, but computer simulations provided a reasonable explanation of the results.
As for predicting the reliability of materials that coat surfaces, Fang says, “We find that when we compare our simulation models with data from the experiments, we get a more complete understanding.”
Says Sandia Fellow and materials researcher Jeff Brinker, “We use simulations quite a bit in support of Sandia’s water purification program and the NIH Nano-Medicine Center program. In all these cases I’m working with theorists and modelers to guide the design of synthetic nanopores so as to develop transport behaviors approaching those of natural water or ion channels that exist in cell membranes.”
How is this understanding achieved?
Models computationally link a variety of size and time scales to create an experimental design.
“We use as much experimental information as possible to validate our methods,” says Alex Slepoy from Sandia’s Multiscale Computational Materials Methods. “The trick is picking a correct modeling strategy from our toolbox of methods.”
Asked whether simulations are merely more complex versions of what a graphic artist produces — a product of the imagination, in short, that cannot accurately produce new details — Slepoy provisionally entertains the idea: “A graphic artist has to make choices that are somewhat subconscious: what size objects to represent, how close-in to zoom, what details to include and exclude, and are there people out there who liked what he drew. So do we.
“But there the similarity ends. For us in computer simulations, the questions are more technical: Does the modeling strategy agree with experiments and is it consistent with established models? Does it have mathematical consistency?”
A further advance in accurate model development, he says, is that “now we’re developing automated methods to tell us whether we’ve satisfied [accuracy] requirements, rather than doing that by just manually looking at results. The method automatically tunes the model to satisfy the entire set of conditions as we know them.”
There is also the matter of cost, says Fang: “With smart people developing numerical methods, models, and algorithms to use computers to study real cases, we find we can rerun calculations merely by changing computer parameters. Thus the cost to push science forward is much cheaper than running experiments — particularly in nanoscience, where the realm is so small that experiments are difficult to perform, testing devices are not available, and data acquisition is a challenge.”
For all these reasons, he says, “This is why at CINT [the Sandia/Los Alamos Center for Integrated Nanotechnology, funded by DOE’s Office of Science], theory and simulation is one of its five thrusts. People view modeling and simulation as a critical component of nanoscience.”
“We need to sit back and put our mindset in a different mode,” he told his audience. “We’re all too busy doing [laboratory] research [instead of considering] how we can leverage resources to push our science to the next level.”
Modeling tools include: meso-scale (an intermediate resolution capability functioning between the atomic and macro scales), classical atomistics (classical force-field theory), Density Functional Theory (a one-electron approximation of quantum theory, where an electron interacts with atoms but not with another electron), and the full quantum model (electrons interacting with other electrons and four or five ions).
Source: Sandia National Laboratories
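As a concrete, if trivial, illustration of the “classical atomistics” level in the list of modeling tools at the end of the release, a classical force field boils down to evaluating analytic pair potentials such as Lennard-Jones. The parameters below are generic argon-like values I picked for illustration, not anything from Sandia’s codes.

```python
# Minimal "classical atomistics" example: Lennard-Jones 12-6 pair energy.
# Epsilon/sigma below are generic argon-like values, for illustration only.

EPSILON = 0.0104   # eV, well depth (assumed)
SIGMA = 3.4        # angstroms, zero-crossing distance (assumed)

def lj_energy(r_angstrom, epsilon=EPSILON, sigma=SIGMA):
    """Lennard-Jones 12-6 pair energy at separation r."""
    sr6 = (sigma / r_angstrom) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

for r in (3.4, 3.8, 4.5, 6.0):
    print(f"r = {r:.1f} A -> U = {lj_energy(r):+.4f} eV")
```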
The pace of water desalination R&D will accelerate
22nd June 2006
The pace of desalination R&D is going to accelerate over the next several years. Professionals will want to check in regularly to find out the latest.