Smart Pigs

25th August 2006

Over the last several weeks there has been a spate of articles on the Alaska Pipeline. The pipeline was corroded and needed to be cleaned. The tool used to inspect the pipeline is called a Smart Pig.

BP has expressed surprise at the amount of corrosion discovered on the Prudhoe Bay network.

The firm says it went nine years without using a robotic device called a “pig” to clean out its lines because company officials did not think the procedure was necessary.

Pigs are used frequently in Canada, and advertisements for such procedures, performed by service companies, appear often in trade publications.

And again here:

Coffman’s reports show, among many things, that BP didn’t run a standard industry test using a robot called a “smart pig,” and that the company didn’t place “coupons” in proper locations. Coupons are pieces of metal inserted into the pipeline flow, and then inspected to determine if corrosion has occurred.

BP said it scheduled a smart pig test for the western line of Prudhoe Bay for the summer of 2006 after ultrasonic testing last fall pointed to the highest rate of corrosion in six years. But the line, which remains open, suffered the March oil spill before the test could be conducted.

BP executives say the corrosion on the pipelines was surprising, because those lines carried oil processed to remove the impurities that cause corrosion. “With that situation, we did not expect the severe corrosion we found,” said Steve Marshall, president of BP Exploration, at a legislative hearing in Anchorage last week.

I don’t know what genius called these things Smart Pigs. But the name is catchy. It conjures the same Ghost of Christmas Present that called on Mr. Scrooge’s clerk Bob Cratchit and blessed his four-roomed house on 15 shillings a week.

huh?

Sounds frugal. Scrooge and Cratchit are frugal. Oil and oil pipelines are expensive. They can only get away with high construction and maintenance bills because of the high cost of oil.

How would you build a 1,000-mile water pipeline that would go uphill inland from the ocean in such a way as to minimize the cost of construction and maintenance over 50-100 years, say, and thereby minimize the cost of the water?

Beats me.

But there is a whole plethora of tools out there on the shop floor that could address this question and, over time, “evolve” (or model) some interesting solutions.

We have discussed modeling for a semipermeable membrane such that in the end you could just stick a pipe in the ocean and have only fresh water flow into the pipe. Another model might be for a machine that could suck any kind of dirt or rock into one end and extrude strong, durable pipe out the other. Another model would be a search for the cheapest combination of passive and active pumping to make water flow inland uphill for 1,000 miles. Another model would be for the cheapest, most long-lasting way to coat the inner pipe so as to avoid corrosion, algae buildup or sedimentation. Another model would be for the cheapest way to monitor and fix a break in the pipeline caused by, say, earthquakes, pipe bombs or corrosion. Another model would be for the cheapest way to monitor water quality as it moves up the line. Hey, this is fun.
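
That pumping question, finding the cheapest combination of passive and active pumping, is the kind of thing even a toy program can start to poke at. Here is a minimal sketch in Python; every number in it (segment climbs, friction losses, pump boost and pump cost) is invented purely for illustration, and a real model would of course use proper hydraulics and real terrain data.

    # Toy search for the cheapest mix of pump stations along an uphill pipeline.
    # All numbers (elevations, losses, costs) are hypothetical placeholders.
    from itertools import product

    SEGMENTS = [30, 10, 45, 5, 60, 20, 35, 15]  # elevation gain per segment (m), made up
    FRICTION_LOSS = 8       # head lost to friction per segment (m), made up
    PUMP_BOOST = 120        # head added by one pump station (m)
    PUMP_COST = 5.0         # capital plus lifetime energy cost per station (arbitrary units)
    START_HEAD = 100        # head available at the intake (m)
    MIN_HEAD = 10           # head must never drop below this (m)

    def feasible(pumps):
        """Walk the line segment by segment and check the head never drops too low."""
        head = START_HEAD
        for gain, has_pump in zip(SEGMENTS, pumps):
            if has_pump:
                head += PUMP_BOOST
            head -= gain + FRICTION_LOSS
            if head < MIN_HEAD:
                return False
        return True

    best = None
    for pumps in product([0, 1], repeat=len(SEGMENTS)):  # brute force over 2^8 layouts
        if feasible(pumps):
            cost = PUMP_COST * sum(pumps)
            if best is None or cost < best[0]:
                best = (cost, pumps)

    print("cheapest feasible layout:", best)

Swap in a real elevation profile and an honest cost model and this brute-force toy becomes the seed of the “evolve a solution” approach described above.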

And now two smaller Cratchits, boy and girl, came tearing in, screaming that outside the baker’s they had smelt the goose, and known it for their own; and basking in luxurious thoughts of sage and onion, these young Cratchits danced about the table, and exalted Master Peter Cratchit to the skies, while he (not proud, although his collars nearly choked him) blew the fire, until the slow potatoes bubbling up, knocked loudly at the saucepan-lid to be let out and peeled.

The Golden Age of Math

18th August 2006

There is a legitimate question to ask, I think, with regard to last week’s post about the predictions of Mihail Roco, senior advisor for nanotechnology to the National Science Foundation. Why is he so confident that fantastic things are going to happen in materials research at five-year intervals or so over the next 20 years?

The brief answer is that he is confident not only in the current generation of tools and methodologies but also in their steady improvement. We’ve already discussed computing power over the next 5-10 years, and how the NSF and DOE are funding research to make supercomputers 1,000 times more powerful than today’s within 10 years. The second, equally important part of this is that the National Science Foundation is funding all kinds of exotic math projects. These math projects form the basis for the algorithms that lie at the heart of computer software.

This past week, press releases on two NSF-funded math projects came out which have a bearing on desalination membrane/catalyst work. The first may have some bearing on sensors that can act in real time to give a complete picture of activity across the entire surface of membranes/catalysts, using only a limited number of sensors. As mathematician Robert Ghrist of the University of Illinois at Urbana-Champaign puts it, “Using topological tools, however, we can more easily stitch together information from the sensors to find and fill any holes in the network and guarantee that the system is safe and secure.”

Anyhow, here is the PR:

Mathematician uses topology to study abstract spaces, solve problems

CHAMPAIGN, Ill. — Studying complex systems, such as the movement of robots on a factory floor, the motion of air over a wing, or the effectiveness of a security network, can present huge challenges. Mathematician Robert Ghrist at the University of Illinois at Urbana-Champaign is developing advanced mathematical tools to simplify such tasks.

Ghrist uses a branch of mathematics called topology to study abstract spaces that possess many dimensions and solve problems that can’t be visualized normally. He will describe his technique in an invited talk at the International Congress of Mathematicians, to be held Aug. 23-30 in Madrid, Spain.

Ghrist, who also is a researcher at the university’s Coordinated Science Laboratory, takes a complex physical system – such as robots moving around a factory floor – and replaces it with an abstract space that has a specific geometric representation.

“To keep track of one robot, for example, we monitor its x and y coordinates in two-dimensional space,” Ghrist said. “Each additional robot requires two more pieces of information, or dimensions. So keeping track of three robots requires six dimensions. The problem is, we can’t visualize things that have six dimensions.”

Mathematicians nevertheless have spent the last 100 years developing tools for figuring out what abstract spaces of many dimensions look like.

“We use algebra and calculus to break these abstract spaces into pieces, figure out what the pieces look like, then put them back together and get a global picture of what the physical system is really doing,” Ghrist said.

Ghrist’s mathematical technique works on highly complex systems, such as roving sensor networks for security systems. Consisting of large numbers of stationary and mobile sensors, the networks must remain free of dead zones and security breaches.

Keeping track of the location and status of each sensor would be extremely difficult, Ghrist said. “Using topological tools, however, we can more easily stitch together information from the sensors to find and fill any holes in the network and guarantee that the system is safe and secure.”

While it may seem counterintuitive to initially translate such tasks into problems involving geometry, algebra or calculus, Ghrist said, doing so ultimately produces a result that goes back to the physical system.

“That’s what applied mathematics has to offer,” Ghrist said. “As systems become increasingly complex, topological tools will become more and more relevant.”
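
The press release stays at a high level, but the underlying question, where are the holes in coverage, is easy to make concrete. Below is a deliberately crude Python sketch that just samples a unit square on a grid and flags the points no sensor reaches. To be clear, this is not Ghrist’s method; his topological tools stitch the local sensor information together far more cleverly. The sensor positions and coverage radius here are made up for illustration.

    # Crude illustration of "holes" in a sensor network's coverage.
    # Sensor positions and radius are hypothetical; this grid check is only a
    # stand-in for the topological approach described in the press release.

    SENSORS = [(0.2, 0.2), (0.7, 0.3), (0.4, 0.8), (0.85, 0.75)]  # (x, y) positions
    RADIUS = 0.3   # coverage radius of each sensor
    GRID = 20      # resolution of the check grid over the unit square

    def covered(x, y):
        """A point is covered if it lies within RADIUS of at least one sensor."""
        return any((x - sx) ** 2 + (y - sy) ** 2 <= RADIUS ** 2 for sx, sy in SENSORS)

    holes = [
        (i / GRID, j / GRID)
        for i in range(GRID + 1)
        for j in range(GRID + 1)
        if not covered(i / GRID, j / GRID)
    ]

    print(f"{len(holes)} of {(GRID + 1) ** 2} sample points are uncovered")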

A second PR, on an NSF-funded math project, deals with the math of minimal surfaces. I reckon it would be used for modeling and simulation.

For most people, soap bubbles are little more than ethereal, ephemeral childhood amusements, or a bit of kitsch associated with the Lawrence Welk Show.

But for Johns Hopkins University mathematician William Minicozzi, the translucent film that automatically arranges itself into the least possible surface area on the bubble wand is an elegant and captivating illustration of a mathematical concept called “minimal surfaces.” A minimal surface is one with the smallest surface area that can span a boundary.

What does this have to do with membranes?

“Minimal surfaces come up in a lot of different physical problems, some more or less practical, but scientists have recently realized that they are extremely useful in nanotechnology,” he said. “They say that nanotechnology is the next Industrial Revolution and that it has the potential to alter many aspects of our lives, from how we are treated for illness to how we fulfill our energy needs and beyond. That’s why increasing numbers of material scientists and mathematicians are discovering minimal surfaces.”
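
For the mathematically inclined, the property Minicozzi describes can be written down compactly. A soap film spanning a wire frame settles into a critical point of the area functional; for a surface expressed as a graph z = u(x, y), that condition is the classical minimal surface equation, which says the mean curvature vanishes everywhere:

\[
\operatorname{div}\!\left(\frac{\nabla u}{\sqrt{1 + |\nabla u|^{2}}}\right) = 0 .
\]

For nearly flat films the equation linearizes to Laplace’s equation, which is one reason minimal surfaces turn up in so many physical models.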

Anyhow, the rest of the article is here.

A final note. In looking through, and considering, the flow of information on materials research over the last several weeks, it occurs to me that the feds are providing leadership, but not of a kind that’s generally recognized. So it may well be that the exasperation with federal leadership expressed by the MIT official last week came from not understanding what the feds are up to. Certainly it looks from here as though the answers to water problems, like those of energy, will come from materials research.

Nanotechnology’s Future

11th August 2006

Mihail Roco, senior advisor for nanotechnology to the National Science Foundation and a key architect of the National Nanotechnology Initiative, penned a piece in Scientific American this week. The article gives a roadmap for nanotechnology, which provides a good context for projecting the future of desalination research.

Today nanotechnology is still in a formative phase–not unlike the condition of computer science in the 1960s or biotechnology in the 1980s. Yet it is maturing rapidly.

Over the next couple of decades, nanotech will evolve through four overlapping stages of industrial prototyping and early commercialization. The first one, which began after 2000, involves the development of passive nanostructures: materials with steady structures and functions, often used as parts of a product. These can be as modest as the particles of zinc oxide in sunscreens, but they can also be reinforcing fibers in new composites or carbon nanotube wires in ultraminiaturized electronics.

The second stage, which began in 2005, focuses on active nanostructures that change their size, shape, conductivity or other properties during use. New drug-delivery particles could release therapeutic molecules in the body only after they reached their targeted diseased tissues. Electronic components such as transistors and amplifiers with adaptive functions could be reduced to single, complex molecules.

Starting around 2010, workers will cultivate expertise with systems of nanostructures, directing large numbers of intricate components to specified ends. One application could involve the guided self-assembly of nanoelectronic components into three-dimensional circuits and whole devices. Medicine could employ such systems to improve the tissue compatibility of implants, or to create scaffolds for tissue regeneration, or perhaps even to build artificial organs.

After 2015-2020, the field will expand to include molecular nanosystems–heterogeneous networks in which molecules and supramolecular structures serve as distinct devices. The proteins inside cells work together this way, but whereas biological systems are water-based and markedly temperature-sensitive, these molecular nanosystems will be able to operate in a far wider range of environments and should be much faster. Computers and robots could be reduced to extraordinarily small sizes. Medical applications might be as ambitious as new types of genetic therapies and antiaging treatments. New interfaces linking people directly to electronics could change telecommunications.

Over time, therefore, nanotechnology should benefit every industrial sector and health care field. It should also help the environment through more efficient use of resources and better methods of pollution control.

It’s always helpful to glance over at doings in energy research. There, the sense of both urgency and opportunity is palpable. This week MIT announced its version of the Manhattan Project.

Scientists at MIT are undertaking a big, ambitious, university-wide program to develop innovative energy tech under the auspices of the university’s Energy Research Council.

“The urgent challenge of our time (is) clean, affordable energy to power the world,” said MIT President Susan Hockfield.

Inaugurated last year, the project is likened by Hockfield to MIT’s contribution to radar — a key technology that helped win World War II.

David Jhirad, a former deputy assistant secretary of energy and current VP for science and research at the World Resources Institute, said no other institution or government anywhere has taken on such an intensive, creative, broad-based, and wide-ranging energy research initiative.

“MIT is stepping into a vacuum, because there is no policy, vision or leadership at the top of our nation,” he said.

Mr. Jhirad may be overstating his case. Certainly I would hope and trust that the same thing could not be said of water desalination research.

Technorati Profile

10th August 2006


Back in June, Sandia Labs announced that they’ll have a new desalination R&D roadmap. The press release is below. The roadmap isn’t out yet, so I’ll throw in my 2 cents on the matter.

The short of it is that imho the roadmap should announce that the goal of desalination R&D over the next 10 years is to make the cost of water desalination 1/10 of what it is today. The R&D roadmap should say that the US government is interested in laying the groundwork for making it economically possible to turn America’s deserts–and the world’s deserts–green and thereby make it economically feasible to increase the habitable size of the USA by 1/3 and double the size of the habitable earth.

Pretty ambitious? Not really. A drop in the cost of water desalination by a factor of roughly 10 in 10 years is baked into the research tools and methodologies available today. However, I think that the accelerating speed of computers will actually enable researchers to drop the cost of water desalination by a factor of roughly 10 in six years.
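
As a quick back-of-the-envelope check on what that implies: a tenfold cost drop spread over ten years means an average annual improvement factor of

\[
10^{1/10} \approx 1.26 \quad (\text{about 21 percent cheaper each year}),
\]

while squeezing the same tenfold drop into six years requires

\[
10^{1/6} \approx 1.47 \quad (\text{roughly 32 percent cheaper each year}),
\]

sustained year after year. Aggressive, but that is the sort of compounding the tools described above are meant to deliver.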

The situation today is analogous to the Human Genome Project initially proposed by the DOE in 1986 and launched in 1990 under both the DOE and the NIH. The goal was to complete the project in 15 years. The first rough draft was completed in 10 years. I think the desalination roadmap should aim for 10 years rather than 15 years because the power of the tools and the quality of the methodologies are many orders of magnitude higher than they were in 1990.

Senator Pete Domenici played an early role in this project. The vision of the founders was breathtaking even when their technology was relatively primitive. The reason they could say “The ultimate goal of this initiative is to understand the human genome” was not just because there was a need to understand the human genome but also because they had the tools and methodology to get there. The same is true today for water desalination.

In 1998 a parallel project was launched by Craig Venter and Celera Genomics. Competition from the private research group accelerated the pace of development.

A more detailed parallel history of the private Venter effort and the public DOE/NIH Human Genome Project is recounted below. The essential takeaway should be that the government provided the vision and leadership and Venter et al. provided the oomph. That said, the history is instructive:

Human Genome Project

History and ongoing developments

The Project was launched in 1986 by Charles DeLisi, who was then Director of the US Department of Energy’s Health and Environmental Research Programs (DeLisi later was awarded the Citizen’s medal by President Clinton for his seminal role in the Project). The goals and general strategy of the Project were outlined in a two-page memo to the Assistant Secretary in April 1986, which helped garner support from the DOE, the United States Office of Management and Budget (OMB) and the United States Congress, especially Senator Pete Domenici. A series of Scientific Advisory meetings, and complex negotiations with senior Federal officials resulted in a line item for the Project in the 1987 Presidential budget submission to the Congress.

Initiation of the Project was the culmination of several years of work supported by the US Department of Energy, in particular a feasibility workshop in 1986 and a subsequent detailed description of the Human Genome Initiative in a report that led to the formal sanctioning of the initiative by the Department of Energy.[1] This 1987 report stated boldly, “The ultimate goal of this initiative is to understand the human genome” and “Knowledge of the human genome is as necessary to the continuing progress of medicine and other health sciences as knowledge of human anatomy has been for the present state of medicine.” Candidate technologies were already being considered for the proposed undertaking at least as early as 1985.[2]

James D. Watson was Head of the National Center for Human Genome Research at the National Institutes of Health (NIH) in the United States starting from 1988. Largely due to his disagreement with his boss, Bernadine Healy, over the issue of patenting genes, he was forced to resign in 1992. He was replaced by Francis Collins in April 1993 and the name of the Center was changed to the National Human Genome Research Institute (NHGRI) in 1997.

The $3-billion project was formally founded in 1990 by the United States Department of Energy and the U.S. National Institutes of Health, and was expected to take 15 years. In addition to the United States, the international consortium comprised geneticists in China, France, Germany, Japan, and the United Kingdom.

Due to widespread international cooperation and advances in the field of genomics (especially in sequence analysis), as well as huge advances in computing technology, a ‘rough draft’ of the genome was finished in 2000 (announced jointly by then US president Bill Clinton and British Prime Minister Tony Blair on June 26, 2000).[3] Ongoing sequencing led to the announcement of the essentially complete genome in April 2003, two years earlier than planned.[4] In May 2006, another milestone was passed on the way to completion of the project, when the sequence of the last chromosome was published in the journal Nature.[5]

While most of the sequencing of the first human genome is “complete,” the project to understand the functions of all the genes and their regulation is far from completion. The roles of junk DNA, the evolution of the genome, the differences between individuals and races, and many other questions are still the subject of intense study by laboratories all over the world.

The role of Celera Genomics

In 1998, an identical, privately funded quest was launched by the American researcher Craig Venter and his firm Celera Genomics. The $300 million Celera effort was intended to proceed at a faster pace and at a fraction of the cost of the roughly $3 billion taxpayer-funded project.

Celera used a newer, riskier technique called whole genome shotgun sequencing, which had been used to sequence bacterial genomes.

Celera initially announced that it would seek patent protection on “only 200-300” genes, but later amended this to seeking “intellectual property protection” on “fully-characterized important structures” amounting to 100-300 targets. Contrary to its public promises, the firm eventually filed patent applications on 6,500 whole or partial genes.

Celera also promised to publish their findings in accordance with the terms of the 1996 “Bermuda Statement,” by releasing new data quarterly (the HGP released its new data daily), although, unlike the publicly-funded project, they would not permit free redistribution or commercial use of the data.

In March 2000, President Clinton announced that the genome sequence could not be patented, and should be made freely available to all researchers. The statement sent Celera’s stock plummeting and dragged down the biotech-heavy Nasdaq. The biotech sector lost about $50 billion in market capitalization in two days.

Although the working draft was announced in June 2000, it was not until February 2001 that Celera and the HGP scientists published details of their drafts. Special issues of Nature (which published the publicly-funded project’s scientific paper) and Science (which published Celera’s paper) described the methods used to produce the draft sequence and offered analysis of the sequence. These drafts are hoped to comprise a ‘scaffold’ of 90% of the genome, with gaps to be filled later.

The competition proved to be very good for the project. The rivals agreed to pool their data, but the agreement fell apart when Celera refused to deposit its data in the unrestricted public database GenBank. Celera had incorporated the public data into their genome, but forbade the public effort to use Celera data.

On 14 April 2003, a joint press release announced that the project had been completed by both groups, with 99% of the genome sequenced with 99.99% accuracy.

…………………………….

And the rest is history.

To reiterate, imho the Desalination and Water Purification Roadmap (“Roadmap 2”) should provide the same government vision and leadership as was provided in the Human Genome Project.

Below is the Sandia PR as published in EurekAlert.
Contact: Chris Burroughs
coburro@sandia.gov
505-844-0948
DOE/Sandia National Laboratories

Desalination roadmap seeks technological solutions to increase the nation’s water supply

Sandia researchers ready to complete research roadmap

ALBUQUERQUE, N.M. — After one last meeting in San Antonio in April, Sandia National Laboratories researchers Pat Brady and Tom Hinkebein are putting the final touches on the updated Desalination and Water Purification Roadmap — “Roadmap 2” — that should result in more fresh water in parts of the world where potable water is scarce.

The updated roadmap is the result of three previous meetings — two in San Diego and one in Tampa — and the last held in April where many government agency, national laboratory, university and private partners gathered to map out the future of desalination in the U.S. The first roadmap identified overall goals and areas of desalination research and was submitted to Congress in 2003.

Brady expects the second roadmap to be completed shortly, and the Joint Water Reuse and Desalination Task Force will then submit it to Sen. Pete Domenici, R-N.M., chairman of the Senate Energy and Water Development Appropriations Subcommittee, Congress and eventually the water user and research communities. The task force consists of the Bureau of Reclamation, the WaterReuse Foundation, the American Water Works Association Research Foundation and Sandia.

The roadmap will recommend specific areas of potential water desalination research and development that may lead to technological solutions to water shortage problems.

“Population growth in the U.S. is expected to increase 13.6 percent per decade [over the next two decades],” says Hinkebein, manager of Sandia’s Geochemistry Department and head of Sandia’s Advanced Concepts Desalination Group. “There will be 29 percent more of us in 20 years. Put that together with an unequal distribution of people — more moving to Texas, California, Arizona and New Mexico where fresh water is limited — and it is easy to see we are facing a challenging water future.”

Sandia is a National Nuclear Security Administration laboratory.

Only 0.5 percent of Earth’s water is directly suitable for human consumption. The rest is composed of saltwater or locked up in glaciers and icecaps. As the world’s population grows, the increased water demand will have to come from someplace. Brackish water seems to be a natural source, Hinkebein says.

Roadmap 2 will outline the specific research needed in high-impact areas to create more fresh water from currently undrinkable brackish water, from seawater, and from wastewater. It will ensure that different organizations are not duplicating research.

Water desalination is not a new concept. In the U.S., the largest plants are in El Paso and Tampa. It is also commonplace in other parts of the world. Except for the Middle East, most desalination is done through reverse osmosis.

Brady says 43 research areas have been tentatively identified and some projects are already under way, jump started with $2 million made available for the preliminary research through a matching grant from the California Department of Water Resources. California provided $1 million and members of the Joint Water Reuse and Desalination Task Force each contributed $250,000.

Another $4 million in fiscal years 2004, 2005 and 2006 through federal Energy and Water Development Appropriations bills secured by Domenici has also funded desalination research at Sandia.

“The task force will decide which of the 43 projects get to the top of the research pile,” Brady says. “As more money is made available, universities, research groups, national laboratories and private companies will bid on projects.”

The 43 research areas in Roadmap 2 include the following:

  • Membrane technologies (mainly reverse osmosis) that desalinate and purify water by pushing it through a semipermeable membrane that removes contaminants.
  • Alternative technologies that take advantage of nontraditional methods.
  • Concentrate management technologies that consider the disposal and/or beneficial use of desalination waste streams.
  • Reuse/recycling technologies that look at ways membrane and alternative technologies can be used to more efficiently recycle water.

Much of the research could be conducted at the soon-to-be-completed Tularosa Basin National Desalination Research Facility in Alamogordo.
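
A closing aside on the first of those research areas. One reason reverse osmosis is such a promising target for big cost reductions is that today’s plants still run well above the thermodynamic floor. As a rough back-of-the-envelope estimate: the osmotic pressure of seawater is about 27 bar, so the minimum work to push a cubic meter of fresh water through an ideal membrane (at vanishingly small recovery) is roughly

\[
w_{\min} \approx \pi \, V \approx 2.7 \times 10^{6}\ \mathrm{J} \approx 0.75\ \mathrm{kWh\ per\ m^{3}},
\]

while real seawater plants consume several kilowatt-hours per cubic meter once pretreatment, pumping losses and finite recovery are counted. The gap between those two numbers is, in effect, the roadmap’s playground.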