Computer Power in 5-10 Years.
30th June 2006
Earlier in June, Eliot Fang remarked at a Materials Research Society meeting: “Fifteen years ago, the Cray YMP [supercomputer] was the crown jewel; it’s now equivalent to a PDA we have in our pocket.”
So where will computers be in five to ten years? According to the Seattle Times:
Monday, June 26, 2006
WASHINGTON — The federal government is pushing computer scientists and engineers to step up the speed and capacity of America’s supercomputers.
Officials say much faster performance is needed to handle a looming tidal wave of scientific, technical and military data.
“Within the next five to 10 years, computers 1,000 times faster than today’s computers will become available. These advances herald a new era in scientific computing,” according to Raymond Orbach, undersecretary for science at the Department of Energy.
Notably, the DOE’s interests are very close to those of the desalination community.
The Department of Energy also is offering $70 million in grants for teams of computer scientists and engineers to develop petascale software and data-management tools.
“The scientific problems are there to be solved, and petascale computers are on the horizon,” said Walter Polansky, senior technical adviser in the department’s Office of Advanced Scientific Computing.
For example, the Energy Department wants ultrafast computers to determine the 3-D structure of molecules that let drugs pass through cell walls, knowledge that can be vital against cancer.
This sort of knowledge would be vital to desalination research as well.
For more information online, go to the National Science Foundation program, at www.nsf.gov/pubs/2005/nsf05625/nsf05625.htm, or the Department of Energy program, at www.scidac.org.
Another possible funding resource would be the new fund created by Warren Buffett and Bill Gates. According to the Gates Foundation’s Grand Challenges in Global Health backgrounder:
A panel of international experts has identified 14 major scientific challenges that, if solved, could lead to breakthroughs in improving global health. The challenges include developing vaccines that do not require refrigeration, preventing insects from transmitting disease, and growing healthy crops in harsh climates. To achieve these breakthroughs, the foundation supports the Grand Challenges in Global Health initiative.
Think the Gates Foundation might be interested in funding computer modeling that leads to cheap, fast, durable desalination membranes and catalysts, which in turn lead to cheap desalinated water that helps farmers grow healthy crops in harsh climates? I do.
Gates Foundation funding might be used to make it simple for desalination researchers without access to the supercomputers at the national labs to meet their modeling requirements. Two models for doing this would be: 1) scientists go to the Gates Foundation directly, or 2) an organization like WaterReuse.org petitions the Gates Foundation for a block grant to cover the scientific and administrative costs of a modeling program that matches scientists with modelers and their supercomputers.
Just a thought.
Often a research organization will have the right questions but limited time, budget, or brainpower with which to solve the problem. Wouldn’t it be nice to say, “OK, we have this problem and we will pay this much for a solution”? A number of websites have grown up in the last couple of years that bring together research organizations and problem solvers, like InnoCentive, YourEncore, and NineSigma. There are a lot of seriously interesting ways this can be used to accelerate water desalination research. Consider the article below from Wired magazine.
http://www.wired.com/wired/archive/14.06/crowds.html?pg=4&topic=crowds&topic_set=
The Rise of Crowdsourcing
By Jeff Howe
3. The Tinkerer
The future of corporate R&D can be found above Kelly’s Auto Body on Shanty Bay Road in Barrie, Ontario. This is where Ed Melcarek, 57, keeps his “weekend crash pad,” a one-bedroom apartment littered with amplifiers, a guitar, electrical transducers, two desktop computers, a trumpet, half of a pontoon boat, and enough electric gizmos to stock a RadioShack. On most Saturdays, Melcarek comes in, pours himself a St. Remy, lights a Player cigarette, and attacks problems that have stumped some of the best corporate scientists at Fortune 100 companies.
Not everyone in the crowd wants to make silly videos. Some have the kind of scientific talent and expertise that corporate America is now finding a way to tap. In the process, forward-thinking companies are changing the face of R&D. Exit the white lab coats; enter Melcarek – one of over 90,000 “solvers” who make up the network of scientists on InnoCentive, the research world’s version of iStockphoto.
Pharmaceutical maker Eli Lilly funded InnoCentive’s launch in 2001 as a way to connect with brainpower outside the company – people who could help develop drugs and speed them to market. From the outset, InnoCentive threw open the doors to other firms eager to access the network’s trove of ad hoc experts. Companies like Boeing, DuPont, and Procter & Gamble now post their most ornery scientific problems on InnoCentive’s Web site; anyone on InnoCentive’s network can take a shot at cracking them.
The companies – or seekers, in InnoCentive parlance – pay solvers anywhere from $10,000 to $100,000 per solution. (They also pay InnoCentive a fee to participate.) Jill Panetta, InnoCentive’s chief scientific officer, says more than 30 percent of the problems posted on the site have been cracked, “which is 30 percent more than would have been solved using a traditional, in-house approach.”
The solvers are not who you might expect. Many are hobbyists working from their proverbial garage, like the University of Dallas undergrad who came up with a chemical to use in art restoration, or the Cary, North Carolina, patent lawyer who devised a novel way to mix large batches of chemical compounds.
This shouldn’t be surprising, notes Karim Lakhani, a lecturer in technology and innovation at MIT, who has studied InnoCentive. “The strength of a network like InnoCentive’s is exactly the diversity of intellectual background,” he says. Lakhani and his three coauthors surveyed 166 problems posted to InnoCentive from 26 different firms. “We actually found the odds of a solver’s success increased in fields in which they had no formal expertise,” Lakhani says. He has put his finger on a central tenet of network theory, what pioneering sociologist Mark Granovetter describes as “the strength of weak ties.” The most efficient networks are those that link to the broadest range of information, knowledge, and experience.
Which helps explain how Melcarek solved a problem that stumped the in-house researchers at Colgate-Palmolive. The giant packaged goods company needed a way to inject fluoride powder into a toothpaste tube without it dispersing into the surrounding air. Melcarek knew he had a solution by the time he’d finished reading the challenge: Impart an electric charge to the powder while grounding the tube. The positively charged fluoride particles would be attracted to the tube without any significant dispersion.
“It was really a very simple solution,” says Melcarek. Why hadn’t Colgate thought of it? “They’re probably test tube guys without any training in physics.” Melcarek earned $25,000 for his efforts. Paying Colgate-Palmolive’s R&D staff to produce the same solution could have cost several times that amount – if they even solved it at all. Melcarek says he was elated to win. “These are rocket-science challenges,” he says. “It really reinforced my confidence in what I can do.”
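An aside from the quoted article: Melcarek’s grounded-tube trick can be sanity-checked with a back-of-the-envelope image-charge estimate. The particle charge, size, and distance in this Python sketch are invented, illustrative values, not figures from the article.

```python
# Back-of-the-envelope image-charge estimate for the grounded-tube trick.
# All numbers are invented for illustration, not taken from the article.
import math

EPS0 = 8.854e-12   # vacuum permittivity, F/m
q = 3e-14          # particle charge, C (near the air-breakdown limit
                   # for a ~10 micron particle)
d = 1e-4           # particle-to-tube distance, m

# A charge near a grounded conductor is pulled toward it as if by an
# opposite "image" charge at distance 2d: F = q^2 / (16*pi*eps0*d^2).
f_image = q**2 / (16 * math.pi * EPS0 * d**2)

# Compare with the particle's weight (10 micron radius, density ~2.6 g/cm^3).
m = (4 / 3) * math.pi * (1e-5) ** 3 * 2600
print(f"image force {f_image:.1e} N vs weight {9.81 * m:.1e} N")
```

With these illustrative numbers the electrostatic pull exceeds the particle’s weight, which is why the charged powder settles onto the grounded tube rather than dispersing.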
Melcarek, who favors thick sweaters and a floppy fishing hat, has charted an unconventional course through the sciences. He spent four years earning his master’s degree at the world-class particle accelerator in Vancouver, British Columbia, but decided against pursuing a PhD. “I had an offer from the private sector,” he says, then pauses. “I really needed the money.” A succession of “unsatisfying” engineering jobs followed, none of which fully exploited Melcarek’s scientific training or his need to tinker. “I’m not at my best in a 9-to-5 environment,” he says. Working sporadically, he has designed products like heating vents and industrial spray-painting robots. Not every quick and curious intellect can land a plum research post at a university or privately funded lab. Some must make HVAC systems.
For Melcarek, InnoCentive has been a ticket out of this scientific backwater. For the past three years, he has logged onto the network’s Web site a few times a week to look at new problems, called challenges. They are categorized as either chemistry or biology problems. Melcarek has formal training in neither discipline, but he quickly realized this didn’t hinder him when it came to chemistry. “I saw that a lot of the chemistry challenges could be solved using electromechanical processes I was familiar with from particle physics,” he says. “If I don’t know what to do after 30 minutes of brainstorming, I give up.” Besides the fluoride injection challenge, Melcarek also successfully came up with a method for purifying silicone-based solvents. That challenge paid $10,000. Other Melcarek solutions have been close runners-up, and he currently has two more up for consideration. “Not bad for a few weeks’ work,” he says with a chuckle.
It’s also not a bad deal for the companies that can turn to the crowd to help curb the rising cost of corporate research. “Everyone I talk to is facing a similar issue in regards to R&D,” says Larry Huston, Procter & Gamble’s vice president of innovation and knowledge. “Every year research budgets increase at a faster rate than sales. The current R&D model is broken.”
Huston has presided over a remarkable about-face at P&G, a company whose corporate culture was once so insular it became known as “the Kremlin on the Ohio.” By 2000, the company’s research costs were climbing, while sales remained flat. The stock price fell by more than half, and Huston led an effort to reinvent the way the company came up with new products. Rather than cut P&G’s sizable in-house R&D department (which currently employs 9,000 people), he decided to change the way they worked.
Seeing that the company’s most successful products were a result of collaboration between different divisions, Huston figured that even more cross-pollination would be a good thing. Meanwhile, P&G had set a goal of increasing the number of innovations acquired from outside its walls from 15 percent to 50 percent. Six years later, critical components of more than 35 percent of the company’s initiatives were generated outside P&G. As a result, Huston says, R&D productivity is up 60 percent, and the stock has returned to five-year highs. “It has changed how we define the organization,” he says. “We have 9,000 people on our R&D staff and up to 1.5 million researchers working through our external networks. The line between the two is hard to draw.”

P&G is one of InnoCentive’s earliest and best customers, but the company works with other crowdsourcing networks as well. YourEncore, for example, allows companies to find and hire retired scientists for one-off assignments. NineSigma is an online marketplace for innovations, matching seeker companies with solvers in a marketplace similar to InnoCentive. “People mistake this for outsourcing, which it most definitely is not,” Huston says. “Outsourcing is when I hire someone to perform a service and they do it and that’s the end of the relationship. That’s not much different from the way employment has worked throughout the ages. We’re talking about bringing people in from outside and involving them in this broadly creative, collaborative process. That’s a whole new paradigm.”
Nanotechnology simulations show what experiments miss
22nd June 2006
Last month Oak Ridge National Laboratory showed how it is evolving computer modeling and simulation in non-desalination-related materials research.
Last week Sandia Labs weighed in on the role of modeling/simulations for water desalination. The forum in this case was the Materials Research Society at its recent semiannual general meeting. I’ve posted the article below and here is a link to the press release. This PR bears comment.
The two most important points are that 1) modeling/simulation results are now as good as, or better than, lab observation and experimentation, and 2) Sandia Labs, like Oak Ridge and others, is automating the way it evaluates its modeling/simulation results.
There are some stupendous kickers here that Sandia’s modelers mention.
“We need to sit back and put our mindset in a different mode,” he told his audience. “We’re all too busy doing [laboratory] research [instead of considering] how we can leverage resources to push our science to the next level.”
What did he mean? Perhaps that models are now sufficiently reliable that the pace of research can be greatly accelerated by deferring most physical experimentation until the models show a result scientists need to observe or replicate in their labs. The second consequence, mentioned below, is the relative ease with which models can be tweaked to get new results: instead of rebuilding a model/simulation from scratch, modelers can go to a database or library of models/simulations and pull the closest one “off the shelf.” The modeler then tweaks it to fit the scientist’s requirements. If the results are good, the scientist can move on to observation and replication.
There is also the matter of cost, says Fang: “With smart people developing numerical methods, models, and algorithms to use computers to study real cases, we find we can rerun calculations merely by changing computer parameters.”
If the model doesn’t work, then it’s back to the drawing board. In either case the new model is added to the library; the sketch below illustrates the loop.
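Here is a minimal Python sketch of that “pull the closest model off the shelf, tweak, rerun” loop. Everything in it, the library records, the tag-matching heuristic, and the run_simulation stub, is hypothetical; it describes no actual lab system.

```python
# Hypothetical sketch of an "off the shelf" model library workflow.
# Records, tags, and the solver stub are illustrative only.

MODEL_LIBRARY = [
    {"name": "nacl_membrane_v1", "tags": {"membrane", "ion-transport"},
     "params": {"pore_nm": 0.8, "temp_K": 300.0}},
    {"name": "h2_catalyst_surface", "tags": {"catalyst", "surface"},
     "params": {"site_density": 1.5e18, "temp_K": 450.0}},
]

def closest_model(required_tags):
    """Pick the library entry sharing the most tags with the request."""
    return max(MODEL_LIBRARY, key=lambda m: len(m["tags"] & required_tags))

def run_simulation(params):
    """Stand-in for a real solver; returns a fake figure of merit."""
    return params["temp_K"] / (1.0 + params.get("pore_nm", 1.0))

# A desalination scientist needs a semipermeable-membrane model.
base = closest_model({"membrane", "semipermeable"})

# Tweak parameters instead of rebuilding from scratch, then rerun.
tweaked = dict(base["params"], pore_nm=0.6)
result = run_simulation(tweaked)

# Whether or not the result pans out, the variant goes back in the library.
MODEL_LIBRARY.append({"name": base["name"] + "_tweaked",
                      "tags": base["tags"] | {"semipermeable"},
                      "params": tweaked})
print(f"reused {base['name']}, result = {result:.2f}")
```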
Of course, the closest model/simulation may not be at Sandia Labs. It might be at Ames or Oak Ridge or wherever. And the model might not be directly related to desalination; rather, it might be something related to surfaces, catalysts, or semipermeable membranes that is of interest to, say, hydrogen modelers.
Linking the modeling libraries of multiple institutions would be tough. Less tough for the desalination community might be setting up a hub-and-spoke relationship with the Sandia modeling center. Or a website could be created where scientists post their modeling/simulation requirements, with some kind of billing/bidding system set up to connect them to Sandia. Or desalination scientists could post their model/simulation requirements at problem-solver networks like InnoCentive, YourEncore, and NineSigma. A minimal sketch of the posting/bidding flow follows.
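For concreteness, here is a tiny Python sketch of the data such a posting/bidding site might pass around. The Challenge and Bid types and the matching rule are purely hypothetical, not any existing service’s API.

```python
# Hypothetical data model for a modeling-requirements bulletin board.
# Field names and the matching rule are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Challenge:
    seeker: str          # e.g. a desalination research group
    requirement: str     # plain-text modeling/simulation need
    budget_usd: float
    bids: list = field(default_factory=list)

@dataclass
class Bid:
    solver: str          # e.g. a national-lab modeling team
    price_usd: float

def best_bid(challenge):
    """Pick the cheapest bid that fits the posted budget, if any."""
    affordable = [b for b in challenge.bids if b.price_usd <= challenge.budget_usd]
    return min(affordable, key=lambda b: b.price_usd, default=None)

c = Challenge("WaterReuse member lab",
              "MD simulation of a semipermeable nanopore membrane", 50_000)
c.bids.append(Bid("Sandia modeling group", 42_000))
c.bids.append(Bid("University cluster team", 55_000))
print(best_bid(c))  # -> Bid(solver='Sandia modeling group', price_usd=42000)
```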
Anyhow, here is the Sandia press release.
Nanotechnology simulations show what experiments miss
[Image: Rendering of Sandia simulations by Michael Chandross demonstrates significant transfer of material to the probe tip of an atomic force microscope.]
Taking issue with the perception that computer models lack realism, a Sandia National Laboratories researcher told his audience that simulations of the nanoscale provide researchers more detailed results — not less — than experiments alone.
The invited talk by Eliot Fang was delivered to members of the Materials Research Society at its recent semiannual general meeting.
Fang derided the pejorative “garbage in, garbage out” description of computer modeling — the belief that inputs for computer simulations are so generic that outcomes fail to generate the unexpected details found only by actual experiment.
Fang not only denied this truism but reversed it. “There’s another, prettier world beyond what the SEM [scanning electron microscope] shows, and it’s called simulation,” he told his audience. “When you look through a microscope, you don’t see some things that modeling and simulation show.”
This change in the position of simulations in science — from weak sister to an ace card — is a natural outcome of improvements in computing, Fang says. “Fifteen years ago, the Cray YMP [supercomputer] was the crown jewel; it’s now equivalent to a PDA we have in our pocket.”
No one denies that experiments are as important as simulations — “equal partners, in fact,” says Julia Phillips, director of Sandia’s Physical, Chemical, and Nanosciences Center.
But the Labs’ current abilities to run simulations with thousands, millions, and even billions of atoms have led to insights that would otherwise not have occurred, Fang says.
For example, one simulation demonstrated that a tiny but significant amount of material had transferred onto the tip of an atomic force microscope (AFM) as it examined the surface of a microsystem.
“The probe tip changed something very, very tiny on the surface of the material,” says Fang. “It was almost not noticeable. But the property of the surface became very different.”
Laboratory observation couldn’t identify the cause of the property change, but computer simulations provided a reasonable explanation of the results.
As for predicting the reliability of materials that coat surfaces, Fang says, “We find that when we compare our simulation models with data from the experiments, we get a more complete understanding.”
Says Sandia Fellow and materials researcher Jeff Brinker, “We use simulations quite a bit in support of Sandia’s water purification program and the NIH Nano-Medicine Center program. In all these cases I’m working with theorists and modelers to guide the design of synthetic nanopores so as to develop transport behaviors approaching those of natural water or ion channels that exist in cell membranes.”
How is this understanding achieved?
Models computationally link a variety of size and time scales to create an experimental design.
“We use as much experimental information as possible to validate our methods,” says Alex Slepoy from Sandia’s Multiscale Computational Materials Methods. “The trick is picking a correct modeling strategy from our toolbox of methods.”
Asked whether simulations are merely more complex versions of what a graphic artist produces (a product of the imagination, in short, that cannot accurately produce new details), Slepoy provisionally entertains the idea: “A graphic artist has to make choices that are somewhat subconscious: what size objects to represent, how close-in to zoom, what details to include and exclude, and whether there are people out there who will like what he drew. So do we.
“But there the similarity ends. For us in computer simulations, the questions are more technical: Does the modeling strategy agree with experiments and is it consistent with established models? Does it have mathematical consistency?”
A further advance in accurate model development, he says, is that “now we’re developing automated methods to tell us whether we’ve satisfied [accuracy] requirements, rather than doing that by just manually looking at results. The method automatically tunes the model to satisfy the entire set of conditions as we know them.”
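That “automatically tunes the model” step can be pictured as ordinary parameter fitting against known conditions, followed by an automated accuracy check. A minimal Python sketch, assuming a toy one-parameter model and a made-up tolerance, using SciPy’s standard curve_fit:

```python
# Illustrative sketch of automated model tuning and acceptance.
# The model form, "experimental" data, and tolerance are invented.
import numpy as np
from scipy.optimize import curve_fit

def model(x, k):
    """Toy one-parameter model: exponential decay with rate k."""
    return np.exp(-k * x)

# Pretend these are experimental validation points.
x_exp = np.array([0.0, 1.0, 2.0, 3.0])
y_exp = np.array([1.00, 0.61, 0.37, 0.22])

# "Automatically tune the model" = fit k to the known conditions.
(k_fit,), _ = curve_fit(model, x_exp, y_exp, p0=[1.0])

# Automated accuracy check instead of eyeballing the plots.
max_err = np.max(np.abs(model(x_exp, k_fit) - y_exp))
print(f"k = {k_fit:.3f}, worst-case error = {max_err:.3f}")
assert max_err < 0.05, "model fails the accuracy requirement"
```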
There is also the matter of cost, says Fang: “With smart people developing numerical methods, models, and algorithms to use computers to study real cases, we find we can rerun calculations merely by changing computer parameters. Thus the cost to push science forward is much cheaper than running experiments — particularly in nanoscience, where the realm is so small that experiments are difficult to perform, testing devices are not available, and data acquisition is a challenge.”
For all these reasons, he says, “This is why at CINT [the Sandia/Los Alamos Center for Integrated Nanotechnology, funded by DOE’s Office of Science], theory and simulation is one of its five thrusts. People view modeling and simulation as a critical component of nanoscience.”
“We need to sit back and put our mindset in a different mode,” he told his audience. “We’re all too busy doing [laboratory] research [instead of considering] how we can leverage resources to push our science to the next level.”
Modeling tools include: meso-scale (an intermediate resolution capability functioning between the atomic and macro scales), classical atomistics (classical force-field theory), Density Functional Theory (a one-electron approximation of quantum theory, where an electron interacts with atoms but not with another electron), and the full quantum model (electrons interacting with other electrons and four or five ions).
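For a flavor of the “classical atomistics” rung of that ladder, here is a minimal evaluation of the Lennard-Jones pair potential, the workhorse of classical force-field theory. The argon parameters are standard textbook values; the snippet is illustrative, not any lab’s production code.

```python
# Minimal classical-atomistics flavor: the Lennard-Jones pair potential,
# V(r) = 4*eps*[(sigma/r)**12 - (sigma/r)**6].
# Textbook argon parameters; purely illustrative.
K_B = 1.380649e-23   # Boltzmann constant, J/K
EPS = 120.0 * K_B    # well depth for argon, J
SIGMA = 0.34e-9      # zero-crossing distance for argon, m

def lj_energy(r_m):
    """Pair interaction energy (J) at separation r_m (meters)."""
    s6 = (SIGMA / r_m) ** 6
    return 4.0 * EPS * (s6 * s6 - s6)

# The energy minimum sits at r = 2**(1/6) * sigma, where V = -EPS.
r_min = 2 ** (1 / 6) * SIGMA
print(f"V(r_min) = {lj_energy(r_min):.3e} J (should equal -{EPS:.3e} J)")
```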
Source: Sandia National Laboratories
The pace of water desalination R&D will accelerate.
22nd June 2006
The pace of desalination R&D is going to accelerate over the next several years. Professionals will want to check in regularly to find out the latest.