
Wednesday, December 19, 2007

MIT to lead ambitious lunar mission

Twin satellites will study the moon's gravitational pull
David Chandler, MIT News Office, December 14, 2007
MIT will lead a $375 million mission to map the moon's interior and reconstruct its thermal history, NASA announced this week.
The Gravity Recovery and Interior Laboratory (GRAIL) mission will be led by MIT professor Maria Zuber and will be launched in 2011. It will put two separate satellites into orbit around the moon to precisely map variations in the moon's gravitational pull. These changes will reveal differences in density of the moon's crust and mantle, and can be used to answer fundamental questions about the moon's internal structure and its history of collisions with asteroids.
The detailed information about lunar gravity will also significantly facilitate any future manned or unmanned missions to land on the moon. Such data will be used to program the descent to the surface to avoid a crash landing and will also help target desirable landing sites. Moreover, the mission's novel technology could eventually be used to explore other interesting worlds such as Mars.
"After the three-month mission is completed, we will know the lunar gravitational field better than we know Earth's," says Zuber, who is head of MIT's Department of Earth, Atmospheric and Planetary Sciences and the E.A. Griswold Professor of Geophysics. She will be the principal investigator for the GRAIL mission.
Former astronaut Sally Ride, the first U.S. woman in space, will lead the project's educational outreach phase, which will include five live MoonKam cameras on each satellite that will be targeted by young students--especially middle-school girls--in their classrooms to get close-up still and video views of the moon's surface.
So far, even such fundamental questions as whether the moon has a separate, differentiated core, as Earth does, remain unanswered, Zuber says. In addition to answering that question, the new mission should reveal details about lunar history, including the relative timing and effects of the myriad huge impacts that created the craters and basins seen on the surface today. The moon, with its airless, un-eroded surface, serves as a kind of Rosetta Stone for understanding the history of all the solar system's inner planets--Mercury, Venus, Earth and Mars--so the mission should also help to unlock secrets of the evolution of all these planets.
"The moon has the best-preserved record of the solar system's early history," Zuber says, while on other planets much of that record has been lost through erosion and other surface changes.
The technology used in the mission is a direct spinoff from the highly successful Gravity Recovery and Climate Experiment (GRACE) mission, which has been mapping Earth's gravitational field since 2002. Using that technology made this a "low risk" mission for NASA because the necessary instruments had already been developed and tested.
As with that mission, GRAIL measurements of the gravitational field will come from very precise monitoring of changes in the distance between the two satellites. The resulting measurements will map the moon's gravitational field up to 1,000 times more accurately than any previous mapping.
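The measurement principle is easy to sketch numerically. The toy one-dimensional simulation below (my own made-up numbers, not mission code) flies two spacecraft along the same track over a buried mass concentration: each craft feels the anomaly's along-track pull at a slightly different time, so the distance between them stretches and shrinks, and that tiny range change is the gravity signal.

    # Toy model: two spacecraft pass over a point-mass anomaly at x = 0.
    GM_ANOMALY = 5.0e4   # hypothetical anomaly strength, m^3/s^2
    DEPTH = 50e3         # hypothetical burial depth, m
    V0 = 1.6e3           # along-track speed, m/s (roughly lunar orbital speed)
    SEP = 200e3          # initial separation between the two craft, m
    DT = 1.0             # integration time step, s

    def along_track_accel(x):
        """Along-track pull of a point mass sitting DEPTH below x = 0."""
        return -GM_ANOMALY * x / (x * x + DEPTH * DEPTH) ** 1.5

    xa, va = -600e3, V0        # trailing spacecraft
    xb, vb = -600e3 + SEP, V0  # leading spacecraft

    for step in range(801):
        if step % 100 == 0:
            print(f"t = {step:4d} s   range change = {(xb - xa) - SEP:+10.5f} m")
        va += along_track_accel(xa) * DT
        vb += along_track_accel(xb) * DT
        xa += va * DT
        xb += vb * DT

The real missions measure such range changes at the micron level and invert millions of them into a global gravity map; the sketch only shows why a density anomaly leaves a signature in the inter-satellite distance at all.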
The main new technology needed to make GRAIL possible was a way to calibrate the timing of the satellites accurately. The Earth-orbiting GRACE satellites use the GPS satellite navigation system, but there is no such system at the moon. Instead, the team adapted a precise radio-signal monitoring technique that was originally designed for a different purpose on another planetary mission now in development, named Juno.
The same technology could be applied to future missions to map the gravitational fields of other interesting worlds such as Mars, where it could reveal the exchange of carbon dioxide between the polar caps and atmosphere or the movement of flowing subsurface water, Zuber says. "We could learn amazing things" from such follow-up missions, she says. "Since we solved the GPS problem for the moon, we could propose this with little modification for other planets."
NASA selected the MIT-led mission from among two dozen original proposals. NASA Associate Administrator for Science Alan Stern noted that "GRAIL's revolutionary capabilities stood out in this Discovery mission competition owing to its unsurpassed combination of high scientific value and low technical and programmatic risk."
The GRAIL satellites will be built and operated by Lockheed Martin Space Systems of Denver, Colo. NASA's Jet Propulsion Laboratory (JPL) in Pasadena, Calif., will handle project management and development of the communications and navigation systems.
The mission's science team also includes David E. Smith of NASA Goddard Space Flight Center (GSFC), who will be the deputy principal investigator, and other researchers from JPL, GSFC, the Carnegie Institution of Washington, the University of Arizona, the University of Paris and the Southwest Research Institute.

Thursday, December 13, 2007

Study finds that linked wind farms can result in reliable power


Wind power, long considered to be as fickle as wind itself, can be groomed to become a steady, dependable source of electricity delivered at a lower cost than at present, according to scientists at Stanford University.
The key is connecting wind farms throughout a given geographic area with transmission lines, thus combining the electric outputs of the farms into one powerful energy source. The findings are published in the November issue of the American Meteorological Society's Journal of Applied Meteorology and Climatology.
Wind is the world's fastest growing electric energy source, according to the study's authors, Cristina Archer and Mark Jacobson, who will present their findings Dec. 13 at the annual meeting of the American Geophysical Union in San Francisco. Their talk is titled "Supplying Reliable Electricity and Reducing Transmission Requirements by Interconnecting Wind Farms."
However, because wind is intermittent, it is not used to supply baseload electric power today. Baseload power is the amount of steady and reliable electric power that is constantly being produced, typically by power plants, regardless of electricity demand. But interconnecting wind farms with a transmission grid reduces the power swings caused by wind variability and makes a significant portion of it just as consistent a power source as a coal power plant.
"This study implies that, if interconnected wind is used on a large scale, a third or more of its energy can be used for reliable electric power, and the remaining intermittent portion can be used for transportation, allowing wind to solve energy, climate and air pollution problems simultaneously," said Archer, the study's lead author and a consulting assistant professor in Stanford's Department of Civil and Environmental Engineering and research associate in the Department of Global Ecology at the Carnegie Institution.
It's a bit like having a bunch of hamsters generating your power, each in a separate cage with a treadmill. At any given time, some hamsters will be sleeping or eating and some will be running on their treadmill. If you have only one hamster, the treadmill is either turning or it isn't, so the power's either on or off. With two hamsters, the odds are better that one will be on a treadmill at any given point in time, and your chances of running, say, your blender, go up. Get enough hamsters together, and the odds are pretty good that at least a few will always be on the treadmill, cranking out the kilowatts.
The combined output of all the hamsters will vary, depending on how many are on treadmills at any one time, but there will be a certain level of power that is always being generated, even as different hamsters hop on or off their individual treadmills. That's the reliable baseload power.
The connected wind farms would operate the same way.
"The idea is that, while wind speed could be calm at a given location, it could be gusty at others. By linking these locations together we can smooth out the differences and substantially improve the overall performance," Archer said.
As one might expect, not all locations make sense for wind farms. Only locations with strong winds are economically competitive. In their study, Archer and Jacobson, a professor of civil and environmental engineering at Stanford, evaluated 19 sites in the Midwestern United States with annual average wind speeds greater than 6.9 meters per second at a height of 80 meters above ground, the hub height of modern wind turbines. Modern turbines are 80 to 100 meters high, approximately the height of a 30-story building, with rotors that span 70 meters or more in diameter.
The researchers used hourly wind data, collected and quality-controlled by the National Weather Service, for the entire year of 2000 from the 19 sites. They found that an average of 33 percent and a maximum of 47 percent of yearly-averaged wind power from interconnected farms can be used as reliable baseload electric power. These percentages would hold true for any array of 10 or more wind farms, provided it met the minimum wind speed and turbine height criteria used in the study.
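One simple way to make "reliable baseload" concrete is an exceedance criterion: the power level the farms deliver at least some high fraction of all hours. The sketch below uses synthetic capacity factors and an assumed 87.5 percent exceedance threshold (both my own illustrative choices, not the paper's data or exact method) to show why averaging partially correlated farms raises the firm share of output.

    import numpy as np

    rng = np.random.default_rng(0)
    HOURS, N_FARMS = 8760, 19

    # synthetic hourly capacity factors: a shared regional weather signal
    # plus farm-local variability (illustrative only; real wind is far
    # more variable, so real firm fractions are lower than these)
    shared = rng.beta(2, 3, HOURS)
    farms = np.clip(0.4 * shared[:, None] + 0.6 * rng.beta(2, 3, (HOURS, N_FARMS)), 0, 1)

    def firm_share(hourly, exceedance=0.875):
        """Output level available at least `exceedance` of the time, as a share of the mean."""
        return np.quantile(hourly, 1.0 - exceedance) / hourly.mean()

    print(f"single farm:     {firm_share(farms[:, 0]):.0%} of average output is firm")
    print(f"19 linked farms: {firm_share(farms.mean(axis=1)):.0%} of average output is firm")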
Another benefit of connecting multiple wind farms is reducing the total distance that all the power has to travel from the multiple points of origin to the destination point. Interconnecting multiple wind farms to a common point and then connecting that point to a far-away city reduces the cost of transmission.
It's the same as having lots of streams and creeks join together to form a river that flows out to sea, rather than having each creek flow all the way to the coast by carving out its own little channel.
Another type of cost saving also results when the power combines to flow in a single transmission line. Explains Archer: Suppose a power company wanted to bring power from several independent farms--each with a maximum capacity of, say, 1,500 kilowatts (kW)--from the Midwest to California. Each farm would need a short transmission line rated at 1,500 kW to carry its output to a common point in the Midwest. A larger transmission line would then be needed between the common point and California--typically with a total capacity of 1,500 kW multiplied by the number of independent farms connected.
However, with geographically dispersed farms, it is unlikely that they would simultaneously be experiencing strong enough winds to each produce their 1,500 kW maximum output at the same time. Thus, the capacity of the long-distance transmission line could be reduced significantly with only a small loss in overall delivered power.
"Due to the high cost of long-distance transmission, a 20 percent reduction in transmission capacity with little delivered-power loss would notably reduce the cost of wind energy," added Archer, who calculated the decrease in delivered power to be only about 1.6 percent.
With only one farm, a 20 percent reduction in long-distance transmission capacity would decrease delivered power by 9.8 percent—not a 20 percent reduction, because the farm is not producing its maximum possible output all the time.
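A back-of-the-envelope simulation makes the asymmetry plain. The sketch below (my own toy wind distribution and idealized turbine power curve, not the paper's measured data, so the exact percentages will differ from the 9.8 and 1.6 figures above) caps the long-distance line at 80 percent of nameplate capacity and compares the clipped energy for one farm against 19 interconnected ones.

    import numpy as np

    rng = np.random.default_rng(1)
    HOURS, N_FARMS, NAMEPLATE_KW = 8760, 19, 1500.0

    # hourly wind speeds: a shared regional component plus local gusts
    regional = rng.weibull(2.0, HOURS)[:, None] * 8.0
    local = rng.weibull(2.0, (HOURS, N_FARMS)) * 8.0
    speeds = 0.3 * regional + 0.7 * local

    def power_curve(v):
        """Idealized turbine: cubic ramp from 3 m/s cut-in to 12 m/s rated, cut-out at 25 m/s."""
        p = np.where((v >= 3.0) & (v < 12.0), (v / 12.0) ** 3, 0.0)
        return np.where((v >= 12.0) & (v < 25.0), 1.0, p)

    outputs_kw = power_curve(speeds) * NAMEPLATE_KW

    def clipped_loss(total_kw, line_cap_kw):
        """Share of yearly energy lost when output above the line cap is discarded."""
        return 1.0 - np.minimum(total_kw, line_cap_kw).sum() / total_kw.sum()

    print(f"1 farm, 80% line:   {clipped_loss(outputs_kw[:, 0], 0.8 * NAMEPLATE_KW):.1%} of energy lost")
    print(f"19 farms, 80% line: {clipped_loss(outputs_kw.sum(axis=1), 0.8 * N_FARMS * NAMEPLATE_KW):.1%} of energy lost")

Because the farms rarely peak simultaneously, the combined output almost never hits the reduced cap, while a single farm is clipped whenever its own winds are strong.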
Archer said that if the United States and other countries each started to organize the siting and interconnection of new wind farms based on a master plan, the power supply could be smoothed out and transmission requirements could be reduced, decreasing the cost of wind energy. This could result in the large-scale market penetration of wind energy—already the most inexpensive clean renewable electric power source—which could contribute significantly to an eventual solution to global warming, as well as reducing deaths from urban air pollution.
A wind power feasibility study of potential sites along the California coast by Mike Dvorak, a Stanford doctoral student in civil and environmental engineering who is working with Jacobson and Archer, also is being presented during an afternoon poster session at the meeting.

Wednesday, December 12, 2007

MIT Ranked #2 Among US Architecture Schools

A Sign of the School’s Ongoing Revitalization
According to Architect Magazine, MIT has been ranked #2 among graduate schools of architecture in the United States, reflecting the school's significant revitalization of its design programs in recent years. In 2007 and 2005, MIT was ranked #4; in 2006, #8; and in 2004, #5.
Tremendous changes in the school have been spearheaded in the past few years by Dean Adèle Naudé Santos, appointed in 2004, and architecture department head Yung Ho Chang, appointed in 2005. Under their leadership, a number of new faculty have been hired - including, most recently, tenured professors Rahul Mehrotra and Nader Tehrani, two very highly-regarded practitioners and educators - and in the spring and fall of 2008 the department will welcome still more high-profile architects/scholars to the faculty.
Meanwhile, a new curriculum for the Master of Architecture program is being implemented this fall and the architecture department's other degree programs are also being reexamined and refined. Those programs include the Bachelor of Science and Bachelor of Science in Art and Design; Masters of Science in Architecture Studies, in Building Technology and in Visual Studies; and PhDs in Building Technology, in Design and Computation and in the History and Theory of Art and Architecture. Dual degrees are also offered.
As the school moves further into the future, it will continue to encourage interdisciplinary research and education, a trademark of the school, and to emphasize diversity in the faculty and student body. Chang also aims to integrate further the discipline groups within the department - building technology; computation; design (including urbanism); history, theory and criticism; and visual arts.
As part of a school-wide effort, new studio space has been created and most studios are being consolidated on the main campus. Construction has also begun on a major new facility for the school designed by Fumihiko Maki, winner of the Pritzker Prize in 1993. Adjacent to and part of the school's legendary Media Lab, the new building will host new design labs, the architecture department's visual arts program and its Center for Advanced Visual Studies. The space will feature an open, atelier-style architecture designed to foster collaboration among all the school's divisions as well as with other divisions of MIT.
The ranking reported in Architect was the result of a poll that surveyed 130 offices of architectural firms, 46 deans of architecture schools and 740 students. The participating firms included many of the country's leaders who, collectively, employ more than 100,000 people. The survey was conducted by The Greenway Group for the Design Futures Council and the journal DesignIntelligence.
Trailing MIT in the survey were Columbia, Cornell, Washington University in St. Louis, Virginia Polytechnic Institute and State University, University of Cincinnati, University of Michigan, UC/Berkeley, Clemson University, Rice and the University of Texas at Austin. Runners-up were Princeton, UVA, Yale, Kansas State and Syracuse University. The #1 slot went to a school up the street from MIT that shall remain unnamed.
New Graduates Bring New Ideas About Sustainability
According to Architect, 'Increasingly, students are more knowledgeable than more experienced practitioners about green building and technologies such as BIM. This is bringing about a phenomenon known as 'up-mentoring', in which interns and architects in their 20s and 30s have more-valuable roles in professional practice than ever before, helping baby boomer and even Generation X colleagues keep pace with technology. Firms using recent graduates solely for AutoCAD production are sorely underutilizing their talent. When our survey asked practitioners if their firms got an infusion of new ideas about sustainability from recent hires, 57 percent said yes, and that response is expected to increase.'

Service oriented architecture (SOA)

Make your existing infrastructure do more for your business, with service oriented architecture (SOA). Using SOA, IT applications that support business processes are restructured into reusable building blocks or 'services', which can be combined, configured, and reused to rapidly meet changing business needs in innovative, cost-effective ways.

Service Oriented Architecture (SOA) is an architectural style that guides all aspects of creating and using business processes, packaged as services, throughout their lifecycle. It also covers defining and provisioning the IT infrastructure that allows different applications to exchange data and participate in business processes regardless of the operating systems or programming languages underlying those applications.

SOA represents a model in which functionality is decomposed into small, distinct units (services) that can be distributed over a network and combined and reused to create business applications.
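A minimal in-process sketch of the idea (all service names and stubs here are hypothetical): business capabilities are registered under stable names, and a business process is just a composition of lookups. In a real SOA the registry would be a shared directory and the calls would travel over the network, for example as SOAP or REST requests.

    from typing import Callable, Dict

    registry: Dict[str, Callable] = {}

    def service(name: str):
        """Register a function as a named, reusable service."""
        def wrap(fn: Callable) -> Callable:
            registry[name] = fn
            return fn
        return wrap

    @service("inventory.check")
    def check_stock(sku: str) -> bool:
        return sku != "OUT-OF-STOCK"                 # stub for a warehouse system

    @service("billing.charge")
    def charge(customer: str, amount: float) -> str:
        return f"charged {customer} ${amount:.2f}"   # stub for a billing system

    def place_order(customer: str, sku: str, amount: float) -> str:
        """A business process composed from existing services."""
        if not registry["inventory.check"](sku):
            return "rejected: out of stock"
        return registry["billing.charge"](customer, amount)

    print(place_order("acme", "SKU-42", 19.99))

The point is that place_order never hard-codes how inventory or billing work; swapping either implementation behind its name leaves the process untouched, which is exactly the reuse-and-recombine promise described above.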

Monday, September 10, 2007

GWS Web server

GWS is the web server used by Google; it runs on Linux. There is also an unrelated product for Windows called Geomatic WebServer (GWS), but in this context the abbreviation GWS stands for Google Web Server.
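You can check the server header yourself with a few lines of Python (standard library only); the value printed reflects whatever Google serves at the time you run it.

    import urllib.request

    # ask Google's homepage for its headers and print the Server field
    req = urllib.request.Request("http://www.google.com/", method="HEAD")
    with urllib.request.urlopen(req) as resp:
        print(resp.headers.get("Server"))   # "gws" at the time of writing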

Friday, May 18, 2007

What is Artificial Intelligence (AI)?

Computational models of human behavior?
Programs that behave (externally) like humans

One thing it could be is "making computational models of human behavior". Since we believe that humans are intelligent, models of intelligent behavior must be AI. There's a great paper by Turing that really set up this idea of AI as making models of human behavior (link). In this way of thinking about AI, how would you proceed as an AI scientist? One way, which would be a kind of cognitive science, is to do experiments on humans, see how they behave in certain situations, and see if you could make computers behave in the same way. Imagine that you wanted to make a program that played poker. Instead of making the best possible poker-playing program, you would make one that played poker the way people do.
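A toy contrast makes the point (the rules below are entirely hypothetical, chosen only to illustrate the modeling stance): an "optimal" player calls exactly when its winning chances beat the pot odds, while a "human model" deliberately reproduces a looser, attachment-driven calling style.

    def optimal_call(win_probability: float, pot_odds: float) -> bool:
        """Call exactly when the expected value is non-negative."""
        return win_probability >= pot_odds

    def human_model_call(win_probability: float, pot_odds: float,
                         attachment: float = 0.1) -> bool:
        """Call more loosely, mimicking a tendency to stay in once invested."""
        return win_probability + attachment >= pot_odds

    for p in (0.20, 0.28, 0.40):
        print(f"win prob {p:.2f}: optimal={optimal_call(p, 0.3)}  "
              f"human-like={human_model_call(p, 0.3)}")

The cognitive-science program described above would fit the attachment parameter to data from real players rather than picking it by hand.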

Useful book:

Winston, Patrick H. Artificial Intelligence. 3rd ed. Reading, MA: Addison-Wesley, 1992. ISBN: 0201533774.

Thursday, May 17, 2007

Important Books in AI


1. Artificial Intelligence, 3rd edition, by Patrick H. Winston

2. Artificial Intelligence: A Modern Approach, by Stuart J. Russell and Peter Norvig

3. Neural Networks in Finance and Investing, Revised Second Edition

4. Machine Learning, by Tom Mitchell