Showing posts with label ROBOTICS. Show all posts

Thursday, 29 August 2013

Scientists Discover Compound to Prevent Noise-Related Hearing Loss

Your mother was right when she warned you that loud music could damage your hearing, but now scientists have discovered exactly what gets damaged and how.


"Noise-induced hearing loss, with accompanying tinnitus and sound hypersensitivity, is a common condition that leads to communication problems and social isolation," said Xiaorui Shi, M.D., Ph.D., study author from the Department of Otolaryngology/Head and Neck Surgery at the Oregon Hearing Research Center at Oregon Health and Science University in Portland, Oregon. "The goal of our study is to understand the molecular mechanisms well enough to mitigate damage from exposure to loud sound."
To make this discovery, Shi and colleagues used three groups of 6- to 8-week-old mice: a control group, a group exposed to broadband noise at 120 decibels for three hours a day for two days, and a third group given single-dose injections of pigment epithelium-derived factor (PEDF) prior to noise exposure. PEDF is a protein found in vertebrates that is currently being researched for the treatment of diseases such as heart disease and cancer. The cells that secrete PEDF in control animals showed a characteristic branched morphology, arranged in a self-avoidance pattern that provided good coverage of the capillary wall. In the animals exposed to wide-band noise, however, the same cells showed clear differences: noise exposure caused changes in melanocytes located in the inner ear.
"Hearing loss over time robs people of their quality of life," said Gerald Weissmann, M.D., Editor-in-Chief of The FASEB Journal. "It's easy to say that we should avoid loud noises, but in reality, this is not always possible. Front-line soldiers or first responders do not have time to worry about the long-term effects of loud noise when they are giving their all. If, however, a drug could be developed to minimize the negative effects of loud noises, it would benefit one and all."

Wednesday, 31 July 2013

Bionic Contact Lenses


Bionic contact lenses are the inevitable goal for products like Google Glass: moving from bulky spectacles to an imperceptible contact lens that lets you read text, check email, and augment your vision with Terminator-style information, sculptures, and artworks. You could also wear them to play video games.
The lenses combine organic materials that are biologically safe with inorganic materials for the electronic circuits, which are built from a layer of metal a few nanometres thick; the light-emitting diodes are one-third of a millimetre across. To assemble them, a grey powder of components is sprinkled onto the lens, and a technique called microfabrication, or "self-assembly", shapes each tiny component: capillary forces pull the pieces into their final positions.
The technology was trialled at the University of Washington in Seattle. The lenses are currently safe and feasible, but they lack a good power source and have only a single-pixel display; the microcircuitry is only enough to support one light-emitting diode, so the device is still best characterized as a prototype. Still, that hasn't prevented the team from dreaming big about the possibilities for computing, gaming, and entertainment, among others.

Honda Develops New Robot to Survey Fukushima Nuclear Plant

Late last year, electronics company Hitachi unveiled a large, 2.5-ton robot to help clean up the Fukushima Daiichi nuclear power plant that was damaged in the 2011 Japanese tsunami. Now Honda has developed a robot of its own to aid in the cleanup effort. Using technologies which were originally developed for the ASIMO humanoid robot, Honda and the National Institute of Advanced Industrial Science and Technology (AIST) jointly developed a new high-access survey robot to collect data on the first floor of the damaged reactor.
When we think of robots doing our dirty work, we tend to imagine a humanoid robot that can walk around and do the things that a person could do. Honda and AIST’s new survey robot isn’t shaped like a human, but some of the technology it depends on was originally developed for Honda’s ASIMO, a humanoid robot that Honda has spent more than two decades developing. The development of the new survey-performing robot arm for Fukushima will in turn help accelerate the development of humanoid robots that could be used at disaster sites, Honda says.
The new survey robot rests on a mobile base containing a crawler platform, and it features a Honda-developed arm that can extend as far as 23 feet. The robot can be remotely controlled via a 400-meter fiber-optic wired LAN or a wireless LAN. As for its survey abilities, the robot uses a zoom camera and a laser range finder to collect 3D data and to help identify sources of radiation. Using a 3D point cloud, the robot can transmit data showing the exact shape of structures inside the facility. According to Honda, the new robot began working inside the facility on June 18.
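Honda hasn't published its survey software, but the geometry behind turning a laser range finder's readings into a 3D point cloud is standard: each (pan, tilt, range) measurement is converted to a Cartesian point. A minimal sketch (the function name and scan format here are hypothetical, not Honda's actual interface):

```python
import math

def scan_to_points(readings):
    """Convert (pan, tilt, range) laser readings (radians, meters)
    into Cartesian (x, y, z) points relative to the sensor."""
    points = []
    for pan, tilt, r in readings:
        x = r * math.cos(tilt) * math.cos(pan)
        y = r * math.cos(tilt) * math.sin(pan)
        z = r * math.sin(tilt)
        points.append((x, y, z))
    return points

# A single reading straight ahead at 2 m:
print(scan_to_points([(0.0, 0.0, 2.0)]))  # [(2.0, 0.0, 0.0)]
```

Sweeping the sensor through many pan/tilt angles yields the dense cloud of points from which the shape of walls and equipment can be reconstructed.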
Source : http://www.honda.co.nz/news/news/2013/honda-jointly-develops-robot-for-post-earthquake-efforts-at-japan-s-fukushima-daiichi-nuclear-power-station/

Will.i.am planning to make singing and dancing robotic dogs


The 38-year-old hip hop and R&B musician Will.i.am plans to create robotic dogs that can sing and dance as well as he can, both to help him on stage and to sell in stores by Christmas.
The Voice coach said: “Last November I was walking through Harrods and I picked up a little stuffed animal. Right next to it was a speaker. Right next to that, there was a little MiFi wireless connector box.”
“I want my technology where I can have a conversation with it. I think there is going to be something that interacts with us that is a little more personal than my phone.”
Overtaken by his inner geek, Will.i.am told Wired Magazine, “I took a day off, got the stuffed animal, ripped it apart, stuck the MiFi connector in its head, sewed it up, put the speaker in his belly, sewed it up, put a freakin’ iPhone in his mouth. It’s my prototype for the next product – we’re working to have it for Christmas.”
“Creativity is relative. You encourage it, but you can’t teach it. Everybody is not going to make it. And that’s a hard rock to swallow. If you talk to a creative person, it’s life or death.”
Will.i.am plans for the musical robotic pooches to be available by Christmas 2013.

3D printer almost entirely made out of Lego


3D printers have been steadily crawling into the market and prices have been gradually decreasing, with the least expensive 3D printer reaching its funding goal on Kickstarter last month at an asking price of US$397.
An engineering student, Matthew Krueger, has been eyeing the Makerbot – a desktop 3D printer – ever since its first appearance on the market. Unfortunately, he didn’t have the funds to purchase one for himself, so he decided to make his own.
With only an old box of legos, Matthew got to work and created what he is calling the Legobot, based on the very first Makerbot Replicator introduced in January 2012. His Legobot prints using hot glue rather than 3D-printing plastics.
Although the Legobot is mostly made out of Lego, it does, of course, have some other components. It is driven by a Lego Mindstorms NXT brick and powered by four separate supplies: 3 volts for the extruder motor, repurposed from the lens-adjustment motor of an old VHS camera; 7.2 volts for the NXT brick; 12 volts for the fan; and 115 volts for the hot glue gun used to print things out. The gear racks were 3D-printed by a friend of Matthew's, and some coins were used to balance the weight of the motor.
Since the Legobot uses hot glue instead of 3D-printing plastics, it doesn't print nearly as well as a Makerbot. The glue isn't as rigid and has only a few practical applications; at best, the output is suitable for window stickers. At this point, the extruder must be turned on and off manually. Matthew plans to experiment with wax and resin to try to make his 3D printer a little more functional.
“While it does print, I would call this more of a prototype than a finished project,” he said of his project.
A video of the Legobot in use is here.
If you would like to construct and even modify one for yourself, Matthew has posted instructions on Instructables here.

NASA’s New Robot that Mimics the Movements of Monkeys

From giant mechanical jellyfish and manta rays to birds, the field of robotics is quickly moving up the evolutionary chain. In response to DARPA’s Robotics Challenge, NASA’s Jet Propulsion Laboratory has produced a bot that mimics the motions of monkeys. The RoboSimian has been designed for search and rescue missions, and like its living counterparts, it can swing and climb through its environment with ease.
While other competitors for the Robotics Challenge have chosen to fabricate more humanoid machines, JPL decided to take inspiration from a slightly more primitive source. The RoboSimian features four general purpose limbs, and no defined back or front. Without the need for a head, the robot has an increased range of motion and can quickly adjust and operate in any direction.
The RoboSimian is still under construction, and the model is awaiting the installation of hands and feet. The JPL team has partnered with Stanford University to design structures that will be able to grasp and manipulate objects. The DARPA contest is slated to begin in December, and the group at NASA has high hopes that the mechanical monkey will be able to accomplish the competition's established tasks, such as driving a utility vehicle, opening doors, climbing ladders and stairs, and breaking through walls. Knowing the kind of feats that real monkeys can achieve in the wild, JPL is looking forward to seeing what its artificial animal can accomplish.
Source : http://gizmodo.com/robosimian-nasa-s-new-monkey-robot-designed-for-search-824994628?utm_source=feedburner&utm_medium=feed&utm_campaign=Feed%3A+gizmodo%2Ffull+%28Gizmodo%29

Recycling Robot Of The Future “Erases” Entire Concrete Buildings


Demolition is a messy business: not only does the process require heavy machinery and produce clouds of dust, but it also results in giant piles of rubble that often head straight for the landfill. Omer Haciomeroglu, a student at Sweden's Umeå Institute of Design, has designed Ero, a robot that recycles concrete in an energy-efficient manner and separates it from rebar and other debris on the spot. The project won the 2013 International Design Excellence Award (IDEA) in the Student Designs category.
Heavy machines used in demolition consume large amounts of energy to crush concrete walls into small pieces, and the process has to be accompanied by large amounts of water sprayed onto the structures to prevent the spread of dust. Once the work is done, the rubble is transported to recycling stations where the waste is separated manually; power crushers pulverize the concrete, and the metal is melted for reuse.
ERO Concrete Recycling Robot can efficiently disassemble concrete structures without any waste, dust or additional separation. It is strategically placed in a building in order to scan the environment and determine the optimal way in which the operation should be executed. This smart robot has the option of switching between pulverizing and smart deconstruction modes, taking buildings down step by step. It enables reclaimed building materials to be reused as prefab concrete elements by utilizing a water jet to crack the concrete surface, separate the waste and package the dust-free material.
After deconstructing the structure with high-pressure water and sucking and separating the aggregate, cement and water, the ERO robot recycles the water back into the system. Clean aggregate is packed and labeled to be sent to concrete precast stations for reuse, while rebar is cleaned and cut, ready to be reused.
One of the goals of the project was to provide a smart, sustainable near-future approach to demolition that facilitates reuse as much as possible. Today, operators manually control heavy machinery of various sizes, which consumes a lot of energy to smash and crush a concrete structure into dusty bits. Water has to be sprayed constantly from fire hoses to prevent harmful dust from spreading. After the work is done, big machines scoop up the rebar-and-concrete mixture and transfer it to recycling stations outside the city, where the waste is separated manually. The concrete must be crushed with power crushers in several stages, and the end result can only be used for simple construction layouts; the metal is melted for reuse.

Monday, 29 July 2013

iRobot Ava 500 Autonomous Telepresence Robot is Designed for Chatting


Business grows more global every day, and what was once done by a single corporation is now more likely to be spread over many small businesses. Ideally, managers and remotely based employees would like a virtual presence at a location, but telepresence robots are often more like smartphones on remote-controlled sticks, lacking any feeling of personal presence or naturalism. At the InfoComm 2013 Conference and Expo in Orlando, Florida, iRobot, in collaboration with Cisco, has unveiled the Ava 500, a telepresence robot that combines autonomous navigation with a high-definition screen for a more natural telepresence.
According to an often-cited UCLA study, seven percent of communication is verbal and 93 percent is non-verbal. Beyond this statistic, many people are much more comfortable working with others in person, and a surprising amount of work is achieved through seemingly casual conversation. A chat while leaving a meeting can make a deal, and a casual observation on a factory tour can crack a seemingly insoluble problem.
Teleconference systems are meant to provide something like this, but specially equipped conference rooms or carts are static, and current telepresence robots have low-definition video and handle a bit like remote-controlled toy cars.
iRobot has been working on the latter part of the problem with its RP-VITA, a robot marketed to medical customers that uses an autonomous navigation system to simplify moving it about. The audiovisual quality was still lacking, however, so Cisco was brought in to mate iRobot's Ava robotic platform with Cisco's TelePresence EX60 21.5-inch HD screen and camera, producing what the companies call the first telepresence robot with high-definition video. The result is the Ava 500.
The idea is to go beyond desk- and table-bound teleconferences and allow a more personal presence and more natural interaction. According to iRobot, the Ava 500 can be used not only for meetings, but also for presentations, factory tours, off-site management, "visits" by people in remote offices, and team collaboration. More importantly, it aims to let users interact comfortably with people at the remote location and carry on the informal conversations where ideas are swapped.
Half of the equation is the Ava 500’s autonomous navigation system. Instead of the user piloting the robot around, it maps out its environment beforehand and remembers where things are and how to get from point A to point B safely and efficiently. An iPad interface is used to schedule and control the Ava 500 while the audio/video interface is a dedicated high-definition screen and camera on the user’s desk.
Transfer to the robot is seamless and direct without any complex on-screen searching and check in. The destination is selected by tapping a map or choosing a room or person from a list. Once the location is selected or a scheduled appointment time is reached, an available robot activates and heads for the destination on its own. As it travels, it senses people and other obstacles and tries to avoid them. If it’s in “public” mode, the user’s face is displayed on the screen. If it’s in “private,” it’s not.
Once at the destination, the user can control the robot from the tablet with little training and can even raise or lower the screen to accommodate others who may be sitting or standing. When finished, the Ava 500 automatically returns to its charging station. During the operation, security is provided by Cisco Aironet 1600 Series wireless access points.
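iRobot hasn't detailed the Ava 500's planner, but the "map the environment once, then get from point A to point B around obstacles" behaviour can be illustrated with the simplest possible planner: a breadth-first search over an occupancy grid. This is a toy stand-in for illustration, not iRobot's actual algorithm:

```python
from collections import deque

def shortest_path(grid, start, goal):
    """Breadth-first search on an occupancy grid (0 = free, 1 = obstacle).
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        r, c = path[-1]
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(path + [(nr, nc)])
    return None  # goal unreachable

# A 3 x 3 map with a partial wall down the middle column:
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(shortest_path(grid, (0, 0), (0, 2)))
```

A real robot plans over a map built from its sensors and replans when people or other obstacles appear, but the core idea of searching a known map for an efficient route is the same.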
The Ava 500 will be available to select Cisco partners early next year. iRobot and Cisco are demonstrating the robot at InfoComm 2013 from June 12 to 14.
The video below shows the Ava 500 in action.

T8 robot tarantula gives everyone the willies

Legged robot kits aren’t anything new, but unlike its competition, the T8 octopod comes with a disturbingly realistic 3D-printed exoskeleton that is sure to make an unforgettable first impression. Robugtix (a robotics company based in Hong Kong) is living up to its name with the lifelike robot tarantula, and it can be yours later this year for an introductory price of US$1,350.
The T8 is powered by 26 Hitec HS-35HD servo motors: three in each of its eight legs, plus additional servos to wiggle its abdomen. This is a fairly small servo type with low torque, so performance is somewhat limited, but it keeps the cost down. The company says the first batch will ship September 30th.
The company is also offering a hexapod robot called the iitsii, but that one is smaller and doesn't have the realistic shell. Its body is made of PCB, and it comes with 20 servos (even smaller and cheaper than those in the T8), so it is priced at a more affordable US$250. This kit will be available August 31st.
Robugtix's iitsii is a smaller, more affordable hexapod robot kit
Both robots come preloaded with the company's Bigfoot Inverse Kinematics Engine to control the legs, body position, and walking gait. This means you won't have to program the robot to move as realistically as a spider or ant, which would be difficult and time-consuming to do yourself. The nice thing about inverse kinematics is that the robot can tilt and shift its body menacingly while the legs remain still.
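Robugtix hasn't released the internals of its Bigfoot engine, but the core idea of inverse kinematics, working backwards from where a foot should be to the joint angles that put it there, can be sketched for a single two-segment planar leg. This is a simplified illustration (the function and its conventions are hypothetical, not Robugtix's API):

```python
import math

def two_link_ik(x, y, l1, l2):
    """Planar two-link inverse kinematics: given a foot target (x, y)
    and segment lengths l1 and l2, return (hip, knee) joint angles
    in radians for one of the two possible solutions."""
    d2 = x * x + y * y
    # Law of cosines gives the knee bend needed to reach the target.
    cos_knee = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if abs(cos_knee) > 1:
        raise ValueError("target out of reach")
    knee = math.acos(cos_knee)
    # Hip angle: direction to target, corrected for the bent knee.
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee

# Reach a foot target at (1, 1) with two unit-length segments:
print(two_link_ik(1.0, 1.0, 1.0, 1.0))
```

Solving this for each leg every control cycle is what lets an engine like Bigfoot keep the feet planted while the body tilts and shifts: the body motion changes each foot's target in body coordinates, and the solver finds the new joint angles.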
You’ll also need to buy the Robugtix Controller (an extra $85) and a single 4 x AA 4.8V NiMH rechargeable battery pack, which unfortunately aren’t included with the kit. The controller uses a wireless Xbee module to relay commands to the robot, which is essential if you’re going to have it creep around corners to prank friends and family. And if you’re interested, you’ll probably want to pre-order now as both robots will go up in price after the early bird special.
Be sure to check out the robots in action in the following videos, and keep an eye on Robugtix's website in the coming weeks for videos of the robots walking and photos of the T8's internal structure.

Robot astronaut Kirobo headed for ISS in August

In what may not be the most historic event in space exploration, but may well be the cutest, Toyota has announced that the Kibo Robot Project's "robot astronaut" Kirobo will be sent to the International Space Station (ISS) on August 4. Unlike its human counterparts, the 13.4-in (34 cm) tall humanoid robot will travel aboard an unmanned Kounotori 4 cargo spacecraft launched from the Japan Aerospace Exploration Agency's (JAXA) Tanegashima Space Center atop an H-IIB rocket. Once at the ISS, Kirobo is scheduled to conduct the first-ever robot-human conversation experiments in December.
Weighing only a kilogram (2.2 lb), Kirobo is one of two humanoid verbal-communication robots built by the Kibo Robot Project, a partnership that includes Dentsu, the University of Tokyo's Research Center for Advanced Science and Technology, Robo Garage, and Toyota. It's based on the commercial Robi kit robot, with modifications for operating safely in zero gravity, face recognition and, according to its developers, the ability to recognize emotions. Toyota provided the speech-recognition software, while Dentsu programmed the robot's speech content and was responsible for overall project management.
Kirobo won’t have much to do on board the ISS at first, aside from uttering its first words in space, because it will be awaiting the arrival of Commander Koichi Wakata in November or December. Until then, conversation with Kirobo will be somewhat limited because it only speaks Japanese.
Formal conversation tests are slated to begin in December. Meanwhile, the backup "ground crew" robot Mirata will be used to verify the experiments and help troubleshoot any problems Kirobo may encounter, as well as fulfilling public relations duties. The development partners hope that lessons learned in the experiment will help them improve their own robots. Kirobo is expected to return to Earth in December 2014.
The video below (in Japanese) shows Kirobo going through its paces.

Rosphere spherical robot could be rolling up for work to monitor and tend crops


If you see what looks like a hamster ball rolling around a cornfield, it doesn’t mean that someone’s pet is incredibly lost. It may be an experimental robot developed by the Robotics and Cybernetics Research Group at the Universidad Politécnica de Madrid (UPM) called Rosphere. The spherical robot can propel itself over uneven ground and may one day be rolling up for work in fields to monitor and tend crops.
Spherical robots aren’t new. There have been a number built over the years for use in military operations, security, and experiments in space exploration. Rosphere’s approach is to take the simplicity of the sphere to make a robot that is low cost and a bit more general purpose. Its spherical shape gives the robot the ability to handle rough terrain, yet is safe to use around humans and delicate crops.
Mechanically, the Rosphere prototype is remarkably simple. The researchers compare the robot’s “mechatronics” to a hamster ball, which it strongly resembles except for the rubber ridges on the outside and the mechanical workings inside. Like a hamster making a ball roll by running up the sides to shift the center of gravity, the Rosphere uses an eccentric pendulum rotating on an axle to roll and steer itself.
The pendulum consists of ballast hanging by an arm from the ball’s axle. This ballast incorporates the robot’s battery and the axle carries Rosphere’s Wi-Fi antennas and electronics package. The pendulum has two rotational degrees of freedom along the transverse and longitudinal axes. By controlling the pendulum’s swing, the robot can roll forward and backward and steer.
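The pendulum drive can be put in rough quantitative terms: the torque that rolls the ball comes from the ballast's weight acting at the end of the swung arm, just as with a hamster leaning up the side of its ball. A minimal sketch, using made-up numbers rather than UPM's actual parameters:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def pendulum_drive_torque(mass, arm_length, swing_angle):
    """Torque (N*m) about the ball's axle produced by swinging the
    internal pendulum mass away from vertical (angle in radians).
    Offsetting the centre of gravity like this is what rolls the sphere."""
    return mass * G * arm_length * math.sin(swing_angle)

# Hypothetical values: a 2 kg ballast on a 0.15 m arm, swung 30 degrees.
print(pendulum_drive_torque(2.0, 0.15, math.radians(30)))
```

Swinging the pendulum along the longitudinal axis rolls the ball forward or backward; swinging it along the transverse axis shifts the centre of gravity sideways to steer.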
UPM sees the main application for Rosphere being in precision agriculture. That is, instead of tending crops by broadcasting pesticides and fertilizers and dealing with a field as a whole, small robots can tend the individual plants like a gardener. Robots like Rosphere would be able to move about crops without damaging them, making close-up examinations of local conditions and precisely applying pesticides and fertilizers.
Tests of Rosphere were conducted on a farm where it was put up against rough terrain and different soils while testing for moisture and other environmental variables. Afterwards, it was tested at the Parque del Retiro of Madrid to see if it could operate safely with people. According to UPM, the results have so far been satisfactory.
The project results were published in Industrial Robot.
The video below shows Rosphere in action.

Computer as Smart as a 4-Year-Old?


Researchers in artificial and natural intelligence at the University of Illinois at Chicago have IQ-tested one of the best available artificial intelligence systems to see how intelligent it really is.
Turns out it’s about as smart as the average 4-year-old, they will report July 17 at the U.S. Artificial Intelligence Conference in Bellevue, Wash.
The UIC team put ConceptNet 4, an artificial intelligence system developed at M.I.T., through the verbal portions of the Wechsler Preschool and Primary Scale of Intelligence Test, a standard IQ assessment for young children.
They found ConceptNet 4 has the average IQ of a young child. But unlike most children, the machine’s scores were very uneven across different portions of the test.
“If a child had scores that varied this much, it might be a symptom that something was wrong,” said Robert Sloan, professor and head of computer science at UIC, and lead author on the study.
Sloan said ConceptNet 4 did very well on a test of vocabulary and on a test of its ability to recognize similarities.
“But ConceptNet 4 did dramatically worse than average on comprehension, the ‘why’ questions,” he said.
One of the hardest problems in building an artificial intelligence, Sloan said, is devising a computer program that can make sound and prudent judgments based on a simple perception of a situation or facts, which is the dictionary definition of common sense.
Common sense has eluded AI engineers because it requires both a very large collection of facts and what Sloan calls implicit facts: things so obvious that we don't know we know them. A computer may know the temperature at which water freezes; we simply know that ice is cold.
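The "very large collection of facts" in systems like ConceptNet is typically stored as subject-relation-object triples. A toy sketch of the idea (the facts and relation names here are invented for illustration, not ConceptNet's actual schema):

```python
# A toy knowledge store of (subject, relation, object) triples,
# loosely in the spirit of a semantic network like ConceptNet.
facts = {
    ("water", "freezes_at", "0 C"),
    ("ice", "has_property", "cold"),
    ("ice", "is_a", "frozen water"),
}

def query(subject, relation):
    """Return every object linked to subject by the given relation."""
    return {o for s, r, o in facts if s == subject and r == relation}

print(query("ice", "has_property"))  # {'cold'}
```

The hard part Sloan describes is not the lookup but the coverage: the implicit facts, like "ice is cold", rarely get written down, so they rarely make it into the store in the first place.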
“All of us know a huge number of things,” said Sloan. “As babies, we crawled around and yanked on things and learned that things fall. We yanked on other things and learned that dogs and cats don’t appreciate having their tails pulled.” Life is a rich learning environment.
“We’re still very far from programs with common sense, AI that can answer comprehension questions with the skill of a child of 8,” said Sloan. He and his colleagues hope the study will help focus attention on the “hard spots” in AI research.
Study coauthors are UIC professors Stellan Ohlsson of psychology and Gyorgy Turan of mathematics, statistics and computer science; and UIC mathematical computer science undergraduate student Aaron Urasky.
The study was supported by award N00014-09-1-0125 from the Office of Naval Research and grant CCF-0916708 from the National Science Foundation.