Showing posts with label Technology Concepts. Show all posts

Friday 21 October 2011

Amazing New Motion Capture Tech Makes Games Look Like Films


Check out this amazing new video on YouTube. It's a short clip on the making of the upcoming video game, L.A. Noire. Specifically, it focuses on a new motion capture technology called MotionScan, which produces the most lifelike in-game scenes yet.



Brendan McNamara, lead developer for L.A. Noire, has said: "We're definitely blurring the lines now. I want this game to be the flashpoint where people start to think of games and film as being on the same level, because I'm confident they already are." It's hard to argue after seeing the footage. The company behind MotionScan is Depth Analysis, based in Sydney, Australia. Depth Analysis announced the innovation earlier this year and gave a few hints about how it works.

"MotionScan uses 32 High Definition cameras to capture true-to-life three-dimensional performances at up to 30 frames per second," the company revealed. "Capable of capturing up to 50 minutes of final footage and processing up to 20 minutes of facial animation automatically per day, the technology revolutionizes traditional motion-capture and post-production animation. MotionScan records every emotional detail, mannerism, and facial nuance accurately frame by frame as 3D models.  No markers or phosphorescent paint needs to be applied to the actors at the time of recording, and no manpower is required to clean up data and animate the finer details by hand after the shoot. For directors and cinematographers, an additional advantage of MotionScan is the ability to view an actor’s performance from any angle and re-light in any way from one take without the need for multiple camera and lighting setups that quickly drain production time and budgets."

This comes at a momentous time for motion-capture innovation. Microsoft, of course, recently rolled out the Kinect motion-sensing camera for the Xbox 360. The Kinect is sort of the DIY version of MotionScan, letting gamers transport themselves into the action. It'll be interesting to see how this increased realism impacts game development in the coming year. Perhaps most significantly, it may lead more A-list Hollywood actors into games. After all, as the L.A. Noire footage shows (it features an actor from the hit show Mad Men), the precision of the motion capture now lets far more of an actor's "acting" come through.

L.A. Noire, published by Rockstar Games, will likely be a breakthrough title, ushering in a new era of cinematic gameplay. With so much attention now focused on smaller, social games like FarmVille, cinematic epics are primed for reinvention. Gamers may be spending more time on iPhones, but I have no doubt there's always an appetite for big, brash, immersive epics like L.A. Noire.

Monday 17 October 2011

Skinput: Appropriating the Body as an Input Surface


To download the research article on Skinput, click here.

We present Skinput, a technology that appropriates the human body for acoustic transmission, allowing the skin to be used as an input surface. In particular, we resolve the location of finger taps on the arm and hand by analyzing mechanical vibrations that propagate through the body. We collect these signals using a novel array of sensors worn as an armband. This approach provides an always available, naturally portable, and on-body finger input system. We assess the capabilities, accuracy and limitations of our technique through a two-part, twenty-participant user study. To further illustrate the utility of our approach, we conclude with several proof-of-concept applications we developed.

Project Team

Chris Harrison

Dan Morris

Desney Tan

In Chris Harrison's ideal world, mobile phones would be the size of matchbooks. They'd have full-size keyboards. They'd browse the Web. They'd play videos.


And, most importantly, you'd never have to touch them.

Sound like too much to ask? Maybe not.

Harrison, a graduate student at Carnegie Mellon University and a former intern at Microsoft Research, has developed a working prototype of a system called Skinput that does just that, essentially by turning a person's hand and forearm into a keyboard and screen.

"People don't love the iPhone keyboard. They use them. But they don't love them," Harrison said in a interview at the recent Computer-Human Interaction conference. "If you could make the iPhone keyboard as big as an arm -- that would be huge."

Using Skinput, a person could tap their thumb and middle finger together to answer a call; touch their forearm to go to the next track on a music player; or flick the center of their palm to select a menu item.

All of these sign-language-like movements, which are customizable, would control a gadget in a person's pocket through a Bluetooth connection.

When fitted with a pico-projector, the Skinput system could display an image of a digital keyboard on a person's forearm. So, using Skinput, someone could send text messages by tapping his or her arm in certain places -- without pulling the phone out of a pocket or purse.
"You could pretty much do a lot of what you do on your iPhone," said Harrison, who says Skinput "is [like having] your iPhone on your palm."

The system, which has been under development for eight months, won't be commercially available for two to seven years, said Dan Morris, a Microsoft researcher who is working with Harrison on Skinput.

Before that can happen, Skinput's sensors need to get more accurate, he said. In a 13-person trial in Seattle, Washington, Skinput was found to be 96 percent accurate. But that test only used five buttons. The system would have to improve for people to make use of a full keyboard, which would be the "holy grail," Morris said.

"The accuracy is good, but it's not quite consumer-level yet," he said.

Skinput is one of a number of prototypes, ideas and near-products aiming to make computing more natural.

These devices seek to move beyond the mouse and physical keyboard, letting people communicate with their gadgets by gesturing, using sign language or, in the case of Skinput, tapping on their hands, fingers and forearms.

The arm as an instrument

Understanding how Skinput works makes it seem all the more futuristic.
The system turns a person's arm and hand into a wiggling, pulsating instrument, full of vibrations that can be picked up and translated.

Skinput users wear an armband -- the prototype version is made of an elbow brace -- that's lined with 10 sensors. These sensors look like tiny diving boards with dumbbells on one end, and they pick up inaudible sounds that range in frequency from 25 to 78 hertz.

When a Skinput user taps a thumb and middle finger together, the impact sends ripples down the skin and through the bones in the person's arm. "They sort of start resonating -- like guitar strings," Harrison said. The diving-board receivers read the sound waves to figure out what gesture the person made, and then relay that information to a phone.
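As a rough sketch of the first processing step such a pipeline needs, the band the article describes (25 to 78 hertz) can be isolated from each sensor channel with a simple bandpass filter. The sampling rate and filter order below are assumptions, not details from the Skinput team:

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 5500  # assumed sensor sampling rate in Hz (not given in the article)

def isolate_tap_band(channel, fs=FS):
    """Keep only the 25-78 Hz vibrations the armband sensors respond to."""
    sos = butter(4, [25, 78], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, channel)

# Synthetic example: an in-band 60 Hz "resonance" buried in higher-frequency noise.
t = np.arange(0, 1.0, 1 / FS)
raw = np.sin(2 * np.pi * 60 * t) + 0.5 * np.sin(2 * np.pi * 800 * t)
filtered = isolate_tap_band(raw)
```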

Skinput can tell whether a person tapped a middle finger or an index finger, because the two moves sound slightly different to the springy receivers.

The system takes one or two minutes to learn the sounds of a particular person's arm, Morris said, and then it can be used however the user likes.
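That "one or two minutes of learning" amounts to fitting a classifier on labeled example taps from the wearer. Here is a hedged sketch of the idea using toy per-sensor energy features and an off-the-shelf SVM; the article does not describe Skinput's real features or model, so treat every choice here as a stand-in:

```python
import numpy as np
from sklearn.svm import SVC

def tap_features(window):
    """Toy features: log energy per sensor for one tap window
    (window shape: 10 sensors x n samples, matching the 10-sensor armband)."""
    return np.log1p((window ** 2).sum(axis=1))

def calibrate(tap_windows, labels):
    """Fit a per-user classifier from a minute or two of labeled taps."""
    X = np.stack([tap_features(w) for w in tap_windows])
    return SVC(kernel="rbf").fit(X, labels)

def classify(clf, window):
    """Map a new tap to a gesture label, e.g. 'palm' or 'middle-finger'."""
    return clf.predict(tap_features(window)[None, :])[0]
```

The per-user fit is why calibration is quick: only a small model needs retraining for each new arm, not the whole sensing pipeline.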

Trial and error

When they started working on Skinput, Morris and Harrison weren't sure if it would be possible to turn the human arm into a virtual keypad. The pair tried clipping sensors to the ends of peoples' fingers and other strange configurations that made users feel like cyborgs.
"We spent a lot of nights in the lab tapping on our arms and wondering if this would ever happen," Harrison said.

But the most profound achievement of Skinput is proving that the human body can be used as a sensor, he said.

Morris believes Skinput will make computing accessible to people in a way that never would have been possible before.

With Skinput, "literally, computing is always available," he said.

A person might walk toward their home, Harrison said, tap their palm to unlock the door and then tap some virtual buttons on their arms to turn on the TV and start flipping through channels.

"It's almost like magic," he said.


Video


Friday 14 October 2011

Top 10 Futuristic Concept Laptops

Gadgets with top specs are what most people choose, but a laptop with truly unusual features is a craze of its own. This list of futuristic concept laptops collects idealistic, sometimes unrealistic designs that can blow your mind, machines imagined to be highly functional, reliable and fantastically designed. Let's take a look.


10. Anna Lopez's concept is designed especially for use in cars and for people dedicated to mobile work. It is extremely lightweight, with a full-screen keyboard and a transparent display.


9. This one is designed by HP, and its images speak for themselves.


8. The V12 by Canova is a dual-screen laptop in which both displays are touch-sensitive. It is designed especially for graphic editors.


7. Fujitsu's concept, shown at the CEATEC trade show, has a touch keypad with a backlit display and a 5.1 surround-sound system with music keys. This 20-inch gadget is also called the DJ laptop.


6. The Flexi PDA concept comes with a flexible display and is also waterproof.


5. This Frog Design laptop is called the Gelfrog and is as light as a newspaper. It also projects a matching pattern on its surface that can be scanned by people.


4. This laptop concept by Fujitsu uses a flexible, lightweight, paper-like display and looks like an office folder.


3. The Intel Mobile Metro notebook, a Ziba Design concept: at 17.7 inches and just 2.25 pounds, this laptop is totally awesome!


2. The Compenion prototype has an OLED touchscreen display in a slider design, and it supports multi-touch.


1. This one is the VAIO Zoom, a holographic concept notebook that is completely transparent and has a touchscreen keyboard. It is thin, lightweight and totally cool.


'Robot Biologist' Solves Complex Problem from Scratch


First it was chess. Then it was Jeopardy. Now computers are at it again, but this time they are trying to automate the scientific process itself.


An interdisciplinary team of scientists at Vanderbilt University, Cornell University and CFD Research Corporation, Inc., has taken a major step toward this goal by demonstrating that a computer can analyze raw experimental data from a biological system and derive the basic mathematical equations that describe the way the system operates. According to the researchers, it is one of the most complex scientific modeling problems that a computer has solved completely from scratch.
The paper that describes this accomplishment is published in the October issue of the journal Physical Biology and is currently available online.

The work was a collaboration between John P. Wikswo, the Gordon A. Cain University Professor at Vanderbilt, Michael Schmidt and Hod Lipson at the Creative Machines Lab at Cornell University and Jerry Jenkins and Ravishankar Vallabhajosyula at CFDRC in Huntsville, Ala.

The "brains" of the system, which Wikswo has christened the Automated Biology Explorer (ABE), is a unique piece of software called Eureqa developed at Cornell and released in 2009. Schmidt and Lipson originally created Eureqa to design robots without going through the normal trial and error stage that is both slow and expensive. After it succeeded, they realized it could also be applied to solving science problems.

One of Eureqa's initial achievements was identifying the basic laws of motion by analyzing the motion of a double pendulum. What took Sir Isaac Newton years to discover, Eureqa did in a few hours when running on a personal computer.

In 2006, Wikswo heard Lipson lecture about his research. "I had a 'eureka moment' of my own when I realized the system Hod had developed could be used to solve biological problems and even control them," Wikswo said. So he started talking to Lipson immediately after the lecture and they began a collaboration to adapt Eureqa to analyze biological problems.

"Biology is the area where the gap between theory and data is growing the most rapidly," said Lipson. "So it is the area in greatest need of automation."

Software passes test

The biological system that the researchers used to test ABE is glycolysis, the primary process that produces energy in a living cell. Specifically, they focused on the manner in which yeast cells control fluctuations in the chemical compounds produced by the process.

The researchers chose this specific system, called glycolytic oscillations, to perform a virtual test of the software because it is one of the most extensively studied biological control systems. Jenkins and Vallabhajosyula used one of the process' detailed mathematical models to generate a data set corresponding to the measurements a scientist would make under various conditions. To increase the realism of the test, the researchers salted the data with a 10 percent random error. When they fed the data into Eureqa, it derived a series of equations that were nearly identical to the known equations.

"What's really amazing is that it produced these equations a priori," said Vallabhajosyula. "The only thing the software knew in advance was addition, subtraction, multiplication and division."

Beyond Adam

The ability to generate mathematical equations from scratch is what sets ABE apart from Adam, the robot scientist developed by Ross King and his colleagues at the University of Wales at Aberystwyth. Adam runs yeast genetics experiments and made international headlines two years ago by making a novel scientific discovery without direct human input. King fed Adam with a model of yeast metabolism and a database of genes and proteins involved in metabolism in other species. He also linked the computer to a remote-controlled genetics laboratory. This allowed the computer to generate hypotheses, then design and conduct actual experiments to test them.

"It's a classic paper," Wikswo said.

In order to give ABE the ability to run experiments like Adam, Wikswo's group is currently developing "laboratory-on-a-chip" technology that can be controlled by Eureqa. This will allow ABE to design and perform a wide variety of basic biology experiments. Their initial effort is focused on developing a microfluidics device that can test cell metabolism.

"Generally, the way that scientists design experiments is to vary one factor at a time while keeping the other factors constant, but, in many cases, the most effective way to test a biological system may be to tweak a large number of different factors at the same time and see what happens. ABE will let us do that," Wikswo said.

The project was funded by grants from the National Science Foundation, National Institute on Drug Abuse, the Defense Threat Reduction Agency and the National Academies Keck Futures Initiative.


Posted by: Faran Ali

Sunday 9 October 2011

Nokia's nanotech Morph goes on display, signals melting devices in our future

Why is Nokia always trying to outdo everyone with its fancy-schmancy concepts and designs? Why can't they just get in line and keep it simple? We may never know the answer to those questions, but what we do know is that the company is presenting a new concept device called the Morph that would be right at home... in the year 3000. The unit is included in the MoMA's "Design and the Elastic Mind" exhibition catalog, and boasts the ability to stretch and flex to almost any shape a user could think of. The nanotechnology-based device would deliver transparent electronics, self-cleaning surfaces, and the malleability to transform into any number of configurations. Of course, the actual technology required to put this together is years or even decades away, though Nokia expects to see some of these innovations making their way into high-end products within seven years.




The Morph Concept

Launched alongside The Museum of Modern Art “Design and The Elastic Mind” exhibition, the Morph concept device is a bridge between highly advanced technologies and their potential benefits to end-users. This device concept showcases some revolutionary leaps being explored by Nokia Research Center (NRC) in collaboration with the Cambridge Nanoscience Centre (United Kingdom) – nanoscale technologies that will potentially create a world of radically different devices that open up an entirely new spectrum of possibilities.
Morph concept technologies might create fantastic opportunities for mobile devices:
  • Newly-enabled flexible and transparent materials blend more seamlessly with the way we live
  • Devices become self-cleaning and self-preserving
  • Transparent electronics offering an entirely new aesthetic dimension
  • Built-in solar absorption might charge a device, whilst batteries become smaller, longer lasting and faster to charge (see the back-of-envelope sketch below)
  • Integrated sensors might allow us to learn more about the environment around us, empowering us to make better choices
In addition to the advances above, the integrated electronics shown in the Morph concept could cost less and include more functionality in a much smaller space, even as interfaces are simplified and usability is enhanced. All of these new capabilities will unleash new applications and services that will allow us to communicate and interact in unprecedented ways.
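How plausible is the solar-charging bullet above? Here is a back-of-envelope check under loudly stated assumptions; the surface area, cell efficiency, and battery capacity are invented for illustration and are not Nokia figures:

```python
# Back-of-envelope: could "Nanograss" solar charging be useful?
# Every number below is an illustrative assumption, not a Nokia figure.
surface_m2 = 0.01        # assume ~100 cm^2 of sun-facing device surface
irradiance_w_m2 = 1000   # full direct sunlight, standard reference value
efficiency = 0.10        # assumed 10% conversion for a flexible cell
battery_wh = 5.0         # assumed small handset battery

harvested_w = surface_m2 * irradiance_w_m2 * efficiency
print(f"~{harvested_w:.1f} W harvested; "
      f"~{battery_wh / harvested_w:.0f} h of full sun for a full charge")
```

Under these guesses the device harvests about a watt, so trickle-charging through a day is plausible while a fast full charge from sunlight alone is not.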

Flexible & Changing Design
Nanotechnology enables materials and components that are flexible, stretchable, transparent and remarkably strong. Fibril proteins are woven into a three dimensional mesh that reinforces thin elastic structures. Using the same principle behind spider silk, this elasticity enables the device to literally change shapes and configure itself to adapt to the task at hand.
A folded design would fit easily in a pocket and could lend itself ergonomically to being used as a traditional handset. An unfolded larger design could display more detailed information, and incorporate input devices such as keyboards and touch pads.
Even integrated electronics, from interconnects to sensors, could share these flexible properties. Further, utilization of biodegradable materials might make production and recycling of devices easier and ecologically friendly.

Self-Cleaning
Nanotechnology also can be leveraged to create self-cleaning surfaces on mobile devices, ultimately reducing corrosion and wear and improving longevity. Nanostructured surfaces such as "nanoflowers" naturally repel water, dirt, and even fingerprints, utilizing effects also seen in natural systems.

Advanced Power Sources
Nanotechnology holds out the possibility that the surface of a device will become a natural source of energy via a covering of “Nanograss” structures that harvest solar power. At the same time new high energy density storage materials allow batteries to become smaller and thinner, while also quicker to recharge and able to endure more charging cycles.

Sensing The Environment
Nanosensors would empower users to examine the environment around them in completely new ways, from analyzing air pollution, to gaining insight into bio-chemical traces and processes. New capabilities might be as complex as helping us monitor evolving conditions in the quality of our surroundings, or as simple as knowing if the fruit we are about to enjoy should be washed before we eat it. Our ability to tune into our environment in these ways can help us make key decisions that guide our daily actions and ultimately can enhance our health.

Wednesday 5 October 2011

Is the 'Avatar' concept really possible?

Now the highest-grossing film ever, "Avatar," has captivated millions of viewers with its picturesque scenery, extraterrestrial battles, and nature-loving, blue-skinned aliens.
The premise of the film is that humans can enter the world of these 10-foot aliens, called the Na'vi, by way of half-human, half-Na'vi hybrids. A high-tech interfacing mechanism allows a human to remain inert while controlling one of these avatar hybrids just by thinking.
Not only does the human manipulate the avatar's movements and speech, but he or she also experiences life -- every sensation, feeling and emotion -- through the eyes of the hybrid, as if consciousness were transferred.

Scientists say we are many decades, even centuries, away from making this kind of sophisticated interaction possible, if it can be done at all. But the fundamental components required to create this complicated system of mind-controlled avatars are already in the works, and they have useful applications in medicine.

"We're starting to understand the basic building blocks, but the biggest challenges will be emotion and thought -- how to make another organism think what you think, to feel what you feel -- because those networks are much more difficult to sort out," said Dr. Brian Litt, associate professor of neurology and bioengineering at the University of Pennsylvania.

Moving remote objects by thinking

Although nothing as complex as manipulating a creature through thought has been done, scientists working on allowing handicapped people to move prosthetic limbs with their minds are making headway. This idea actually played a role in the movie: Protagonist Jake Sully was in a wheelchair in his human body, but could walk, run and jump as his avatar.

One demonstration has been shown by Miguel Nicolelis, a neuroscientist at Duke University, who is working on robotic leg braces. In 2008, his group got a monkey in North Carolina to mentally control the walking patterns of a robot in Japan.

This was done by implanting electrodes in the brains of two rhesus monkeys. The electrodes recorded how cells in the brain's motor and sensory cortex responded to walking on the treadmill at various speeds. The monkeys' legs also had sensors to record walking patterns.
Researchers used all this information to predict the exact speed of movement and stride length of the legs, and uploaded that information to a robot in Japan, getting the robot to move in synch with a monkey thousands of miles away in real time. Even when the treadmill was turned off, a monkey continued making the robot walk just by thinking for a few minutes.
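The decoding step described here, mapping recorded firing rates to walking speed and stride length, is classically done with a linear model fit to binned neural activity. Below is a minimal sketch on synthetic stand-in data; the data shapes and the plain least-squares fit are assumptions for illustration, not Nicolelis's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-ins: firing rates for 100 neurons over 5000 time bins,
# and two gait parameters (speed, stride length) to decode.
rates = rng.poisson(5.0, size=(5000, 100)).astype(float)
true_w = rng.normal(size=(100, 2))
gait = rates @ true_w + rng.normal(scale=0.5, size=(5000, 2))

# Fit a linear decoder on a training block by least squares...
W, *_ = np.linalg.lstsq(rates[:4000], gait[:4000], rcond=None)

# ...then stream predictions for held-out bins, as if driving the robot.
predicted = rates[4000:] @ W
print("mean decode error:", float(np.abs(predicted - gait[4000:]).mean()))
```

Once the decoder is fit, prediction is just a matrix multiply per time bin, which is what makes real-time control across an ocean feasible.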

Another arena is one of virtual reality: controlling an avatar in a video game with your mind. Jaime Pineda, cognitive neuroscientist at the University of California, San Diego, is working with a brain-computer interface that allows participants to move a car around a racetrack, fly a plane and do other virtual tasks on a screen, simply by thinking. The mental training for this takes about four to six hours, he said.

"It is based on the motor parts of your brain. That's what we're recording from, and so if you think about moving, it's actually as if you are actually moving," he said.

Apart from the entertainment value, Pineda sees this as a future therapy for autistic children. The theory is that because people with autism have less connectivity between various parts of the brain, participating in mind-controlled video games may normalize those circuits. Results from his lab show improvement in social interaction and other behaviors after 10 or 20 weeks of playing the game in the lab.

Uploading information

For an "Avatar"-style brain-computer interface, an enormous amount of data would have to be transferred from the person to the avatar extremely quickly, Litt said.
Here's what's possible now: Scientists such as Gerwin Schalk at the New York State Department of Health's Wadsworth Center have harnessed the brain's electrical impulses to have people mentally type their thoughts using electrodes on the surface of the head. The rate is typically seven words per minute. In one epileptic patient who had electrodes already on the brain for clinical purposes, the record was 20 words per minute.

This kind of technology is useful for people who are paralyzed and cannot communicate, Schalk said. His research group also works on using electrodes to extract specific information from the brain such as people's actual actions, imagined actions and intended actions -- even how they move individual fingers.

The research "has changed that widespread assumption that it's not really impossible to acquire detailed information from the brain in humans," Schalk said.

Understanding brain networks

In order to construct a high-tech interface that would allow two-way communication between an avatar and a person, there must be a better understanding of the brain itself, Litt said.
Litt's group studies the brain's networks involved in epilepsy. The researchers are looking at the abnormal circuits to figure out the basic units of the brain that generate seizures. They have licensed intellectual property for a device that can improve epilepsy by stimulating specific brain regions, potentially eliminating the need for surgery.

"I see tremendous possibilities for more and more ability to unlock these networks," he said.

Transferal of sensations and emotions

The area of this "avatar science" that will be most difficult to sort out is being able to feel and think as the avatar, Litt said.

Today there are auditory prostheses called cochlear implants, which encode signals that let people who otherwise could not hear do so, as well as rudimentary visual aids. It is also possible to stick a pin in a particular part of someone's brain and induce sensations of various temperatures, pressures and even pain levels, just by stimulating certain neural circuits.

"But there's a far cry from doing that to being able to make somebody feel an emotion or see something," Litt said.

The technical hurdles, including transferring huge amounts of information extremely rapidly and building devices to both extract signals from the brain and inject signals into the brain, are significant, he said.

Clearly, people aren't going to be able to climb into personal pods and use their brains to remotely control 10-foot-tall creatures any time soon. Andrew Schwartz, neuroscientist at the University of Pittsburgh, added in an e-mail that it is not even clear what "consciousness" is. There's no rigorous definition, and how it looks in the brain is unknown.

"It's a wonderful movie, but it shouldn't be taken as anything but fantasy," he said.

A guide to the best HAARP conspiracy theories

A military-funded project called the High-frequency Active Auroral Research Program (HAARP), located on remote tundra in Alaska, jumps off the horizon just past mile marker 11 on the Glenn Highway. The program's main facility sits behind a barbed wire fence that stretches as far as the eye can see. What grabs the imagination of most, though, are the couple hundred oversized antennas, described by legions of journalists and conspiracy theorists, including Noah Shachtman of Wired: "180 silver poles rising from the ground, each a foot thick, 72 feet tall, and spaced precisely 80 feet apart ... Geometric patterns form and reform in every direction, Athenian in their symmetry. It looks like a bionic forest."
Those fanged metal structures have made the sleepy, rural Alaska village of Gakona, population 200, a lightning rod for controversy. Like many federally-funded projects in the Last Frontier, HAARP saw its financial peak when former Alaska U.S. Sen. Ted Stevens was at the height of his power in the mid-2000s. Theories abound about what goes on inside HAARP, which was founded in 1990 to conduct research on the ionosphere, an upper level of the atmosphere interesting to scientists for its importance in shortwave radio communication and because it's a place where plasma forms naturally.

Ask the Air Force what they're doing in Gakona these days and a spokesman stationed in New Mexico will tell you to find out yourself during HAARP's open house. They usually hold those every couple years during the summer. Even though all the research is unclassified, the Air Force doesn't offer much else in the way of explaining what's going on, except to point out their noble interest in studying Earth's atmosphere to further scientific knowledge and maybe improve homeland security along the way.
On a theoretical level, the HAARP website notes that federal scientists are working to unlock the mysteries of other natural phenomena that have captivated humans for millennia. They're studying lightning, aurora borealis and the like. They've even learned how to induce both of those on a limited scale, according to a statement included in a Navy defense budget. HAARP also exists, the project's website notes, to learn more about shortwave radio communication and its application in global positioning systems, among other things. Maybe HAARP was used to search for Saddam's WMD. Maybe it's utilized to gather intel on Iran's underground nuclear facilities. Who knows?
Plenty of other theories have been explored about what exactly Uncle Sam is up to way out in the middle of nowhere, Alaska. Here are a few of the best conspiracy theories in a nutshell.

How HAARP Works

HAARP is an ionospheric heater. It changes the shape of the ionosphere, allowing beamed energy to be concentrated on a known spot. It has been speculated that this system can manipulate weather, cause earthquakes, act as a "Death Ray" delivering large amounts of energy to a small location, and put on groovy light shows.


Earthquakes

Could HAARP antennas be generating earthquakes? Eric Dubay, a conspiracy blogger and American expat who lives in Thailand, is part of the crowd that believes the U.S. used HAARP to cause the 8.9-magnitude earthquake that rocked northern Japan in March 2011, leading to the devastating Fukushima-Daiichi nuclear meltdown.
The gist of the argument from Dubay and others is that waves generated by HAARP antennas are focused on a specific part of the ionosphere with enough force to make the entire thing buckle into space; the ionosphere snaps back toward the ground with enough precision to cause a massive earthquake that devastates a strategic target that furthers American economic and defense interests.
Others claim the U.S., for bizarre reasons mostly unsubstantiated, caused the January 2010 earthquake in Haiti. The best guess anyone has come up with is that Haiti was the perfect place for a test run of sorts, which is among 13 reasons included in a post on Godlike Productions that argues the U.S. should be suspected of causing the quake in Port-au-Prince. A column by another conspiracy theorist on UFO-Blogger.com goes a step further in trying to predict what will be hit next: "Most likely the next target will be the New Madrid fault line in the South-Midwestern United States."
Kansans can rest easy, though: Pervez Hoodbhoy, a Pakistani nuclear physicist, tears the earthquake theory to shreds in response to an Islamist group that blamed HAARP for devastating floods in Punjab.



Hurricanes

There's a storied tradition of blaming devastating hurricanes on HAARP. That trend hit a fever pitch in 2005: first it was Katrina, then Rita, then Wilma.
Interestingly, HAARP saw its funding peak that same year at $49.3 million. But why would the U.S. government want to inflict devastation on its own citizens? And how exactly would scientists in Alaska generate Atlantic hurricanes with shortwave radio communication? And what ... well, really, what else needs to be said?
"This is absolute hogwash," Stanford professor Umran Inan told Popular Science. "There's absolutely nothing we can do to disturb the earth's [weather] systems. Even though the power HAARP radiates is very large, it's miniscule compared with the power of a lightning flash -- and there are 50 to 100 lightning flashes every second. HAARP's intensity is very small."


Mind Control

Of all the conspiracies floating around about HAARP, this is perhaps the most entertaining and the most scientifically far-fetched.
The government is using the shortwave radio communication generated in Gakona, so the story goes, to control the minds of unsuspecting Americans. What exactly the Feds are supposedly trying to control is hazy. A good place to get a grip on this one is the conspiracy website HAARP.net, or watch Jesse Ventura's rendition when you have a few minutes. Then go ahead and read Popular Science's rebuttal.



Beyond the conspiracies

What makes HAARP susceptible to conspiracy criticism is simple. The facility doesn't open its doors in the same way as other federally-funded research facilities around the country, and it doesn't go to great efforts to explain the importance of its research to the public.
If you want to visit Oak Ridge National Laboratory (a Manhattan Project-era facility with exponentially greater funding but also a heavy focus on top-secret nuclear technology) you can show up to the visitor center for a public tour or schedule something more in-depth without much hassle. You can do the same at Los Alamos -- another bastion of the Manhattan Project -- in New Mexico. At both of those facilities, journalists can access unclassified research and talk directly to researchers and scientists.
None of that is possible at HAARP, probably in part (though it's never expressly stated) because of the tinfoil-hatters who might storm Gakona if visits of any kind were allowed. When the movement for more information is spearheaded by Jesse Ventura and TruTV, it's easy enough to laugh and let the real research continue away from the public eye. But the closed-shop tendencies could prove the facility's undoing as budget hawks, like the "super" bipartisan group in Congress assigned to dig up trillions of dollars in savings over the next decade, are eager to score political points.
What's to keep HAARP from ending up on the chopping block? Perhaps opening the project up to public scrutiny might keep those federal dollars flowing to Alaska.