Space

Japan's Two Hopping Rovers Successfully Land On Asteroid Ryugu (space.com) 76

sharkbiter shares a report from Space.com: The suspense is over: Two tiny hopping robots have successfully landed on an asteroid called Ryugu -- and they've even sent back some wild postcards from their new home. The tiny rovers are part of the Japan Aerospace Exploration Agency's Hayabusa2 asteroid sample-return mission. Engineers with the agency deployed the robots early Friday (Sept. 21), but JAXA waited until today (Sept. 22) to confirm the operation was successful and both rovers made the landing safely.

In order to complete the deployment, the main spacecraft of the Hayabusa2 mission carefully lowered itself toward the surface until it was just 180 feet (55 meters) up. After the rovers were on their way, the spacecraft raised itself back up to its typical altitude of about 12.5 miles (20 kilometers) above the asteroid's surface. The agency still has two more deployments to accomplish before it can rest easy: Hayabusa2 is scheduled to deploy a larger rover called MASCOT in October and another tiny hopper next year. And of course, the main spacecraft has a host of other tasks to accomplish during its stay at Ryugu -- most notably, to collect a sample of the primitive world to bring home to Earth for laboratory analysis.
JAXA tweeted on Saturday: "We are sorry we have kept you waiting! MINERVA-II1 consists of two rovers, 1a & 1b. Both rovers are confirmed to have landed on the surface of Ryugu. They are in good condition and have transmitted photos & data. We also confirmed they are moving on the surface."
Programming

Coding Error Sends 2019 Subaru Ascents To the Car Crusher (ieee.org) 183

An anonymous reader quotes a report from IEEE Spectrum: [A] software remedy can't solve Subaru's issue with 293 of its 2019 Ascent SUVs. All 293 of the SUVs that were built in July will be scrapped because they are missing critical spot welds. According to Subaru's recall notice [PDF] filed with the U.S. National Highway Traffic Safety Administration, the welding robots at the Subaru Indiana Automotive plant in Lafayette, Ind., were improperly coded, which meant the robots omitted the spot welds required on the Ascents' B-pillar. Consumer Reports states that the B-pillar holds the second-row door hinges. As a result, the strength of the affected Ascents' bodies may be reduced, increasing the possibility of passenger injuries in a crash. Subaru indicated in the recall that "there is no physical remedy available; therefore, any vehicles found with missing welds will be destroyed." Luckily, only nine Ascents had been sold, and those customers are going to receive new vehicles. The rest were on dealer lots or in transit.
Robotics

In a World of Robots, Carmakers Persist in Hiring More Humans (bloomberg.com) 44

It looks like car-industry employees who are concerned about robots taking their jobs don't need to worry -- for now, at least. Of the 13 publicly traded automakers with at least 100,000 workers at the end of their most-recent fiscal year, 11 had more staff compared with year-end 2013, according to data compiled by Bloomberg. Combined, they had 3.1 million employees, or 11 percent more than four years earlier, the data show. From the report: Carmakers in China and other emerging markets, where growth is strongest, favor human labor because it requires less upfront investment, said Steve Man, an analyst at Bloomberg Intelligence in Hong Kong. In developed markets, tasks that can be handled by robots were automated years ago and automakers are now boosting hiring in research and development as the industry evolves. "There's been a lot of growth in emerging markets, especially China, so that's one reason automakers are adding staff," Man said. "More staff is being added on the R&D side, with the push for autonomous, electric, connected vehicles." A trio of Chinese automakers, SAIC Motor, Dongfeng Motor Group and BYD -- in which Warren Buffett is a major investor -- increased staff by at least 24 percent. Volkswagen accounted for more than one in five jobs among the group of 13, and increased its employee count by 12 percent in the period. Things, however, look different at General Motors, which shrank its payroll 18 percent to 180,000, and Nissan Motor, which contracted by 2.8 percent to 139,000 workers, the report added.
Robotics

Machines Are Going To Perform More Tasks Than Humans By 2025 (cnbc.com) 145

In less than a decade, most workplace tasks will be done by machines rather than humans, according to the World Economic Forum's latest AI job forecast. From a report: Machines will overtake humans in terms of performing more tasks at the workplace by 2025 -- but there could still be 58 million net new jobs created in the next five years, the World Economic Forum (WEF) said in a report on Monday. Developments in automation technologies and artificial intelligence could see 75 million jobs displaced, according to the WEF report "The Future of Jobs 2018." However, another 133 million new roles may emerge as companies shake up their division of labor between humans and machines, translating to 58 million net new jobs being created by 2022, it said. At the same time, there would be "significant shifts" in the quality, location and format of new roles, according to the WEF report, which suggested that full-time, permanent employment may potentially fall. Some companies could choose to use temporary workers, freelancers and specialist contractors, while others may automate many of the tasks. New skill sets for employees will be needed as the division of labor between machines and humans continues to evolve, the report pointed out.
Robotics

Automation: The Exaggerated Threat of Robots (flassbeck-economics.com) 134

It will take quite a lot of time before robots become cheaper than workers in emerging markets such as those in Africa, argues Nico Beckert of Flassbeck Economics, a consortium of researchers who aim to provide economic insights with a more realistic basis. From the post: All industrialized countries used low-cost labour to build industries and manufacture mass-produced goods. Today, labour is relatively inexpensive in Africa, and a similar industrialization process might take off accordingly. Some worry that industrial robots will block this development path. The reason is that robots are most useful when doing routine tasks -- precisely the kind of work that is typical of labour-intensive mass production. At the moment, however, robots are much too expensive to replace thousands upon thousands of workers in labour-intensive industries, most of which are in the very early stages of the industrialization process. Robots are currently best used in technologically more demanding fields like the automobile or electronics industry.

Even a rapid drop in robot prices would not lead to the replacement of workers by robots in the short term in Africa where countries lag far behind in terms of fast internet and other information and communications technologies. They also lack well-trained IT experts. Other problems include an unreliable power supply, high energy costs and high financing costs for new technologies. For these reasons, it would be difficult and expensive to integrate robots and other digital technologies into African production lines.

AI

European Parliament Passes Resolution Calling For An International Ban On Killer Robots (bbc.com) 115

An anonymous reader quotes a report from the BBC: The European Parliament has passed a resolution calling for an international ban on so-called killer robots. It aims to pre-empt the development and use of autonomous weapon systems that can kill without human intervention. Last month, talks at the UN failed to reach consensus on the issue, with some countries saying the benefits of autonomous weapons should be explored. Some MEPs were concerned legislation could limit the scientific progress of artificial intelligence, while others said it could become a security issue if some countries allowed such weapons and others did not. The resolution comes ahead of negotiations scheduled at the United Nations in November, where it is hoped an agreement on an international ban can be reached. Israel, Russia, South Korea and the U.S. opposed the new measures, saying that they wanted to explore potential "advantages" from autonomous weapons systems.
Robotics

MIT Machine Vision System Figures Out What It's Looking At By Itself (gsmarena.com) 36

MIT's "Dense Object Nets" or "DON" system uses machine vision to figure out what it's look at all by itself. "It generates a 'visual roadmap' -- basically, collections of visual data points arranged as coordinates," reports Engadget. "The system will also stitch each of these individual coordinate sets together into a larger coordinate set, the same way your phone can mesh numerous photos together into a single panoramic image. This enables the system to better and more intuitively understand the object's shape and how it works in the context of the environment around it." From the report: [T]he DON system will allow a robot to look at a cup of coffee, properly orient itself to the handle, and realize that the bottom of the mug needs to remain pointing down when the robot picks up the cup to avoid spilling its contents. What's more, the system will allow a robot to pick a specific object out of a pile of similar objects. The system relies on an RGB-D sensor which has a combination RGB-depth camera. Best of all, the system trains itself. There's no need to feed the AI hundreds upon thousands of images of an object to the DON in order to teach it. If you want the system to recognize a brown boot, you simply put the robot in a room with a brown boot for a little while. The system will automatically circle the boot, taking reference photos which it uses to generate the coordinate points, then trains itself based on what it's seen. The entire process takes less than an hour. MIT published a video on YouTube showing how the system works.
Bitcoin

Instead of Bobbleheads, Baseball Stadium Tries Handing Out Crypto Tokens (mlblogs.com) 51

The Los Angeles Dodgers will try a high-tech giveaway for their September 21st game: "Digital Bobblehead Night." DevNull127 quotes the digital editor for the Los Angeles Dodgers: While supplies last at guest's point of entry, the first 40,000 ticketed fans in attendance will receive a card with a unique code and directions to a website where a digital bobblehead can be unlocked and added to their Ethereum wallet. The player Crypto token received will be randomly selected, with approximately an equal number of Kershaw, Turner and Jansen codes distributed at the stadium gates.

"We're excited for our first-ever Crypto giveaway, and to explore an entirely new marketplace with our fanbase," said Lon Rosen, Dodger Executive Vice President and Chief Marketing Officer. "We hope this piques the interest of Dodger fans, and will help launch a new age of digital collectibles and promotions."

That stadium already has another high-tech gimmick: Flippy the Burger-Flipping Robot, who reportedly was "called up to the Majors" to help feed hungry baseball fans by cooking up fried chicken tenders and tater tots.
Robotics

MIT Graduate Creates Robot That Swims Through Pipes To Find Out If They're Leaking (fastcompany.com) 35

A 28-year-old MIT graduate named You Wu spent six years developing a low-cost robot designed to find leaks in pipes early, both to save water and to avoid bigger damage later from bursting water mains. "Called Lighthouse, the robot looks like a badminton birdie," reports Fast Company. "A soft 'skirt' on the device is covered with sensors. As it travels through pipes, propelled by the flowing water, suction tugs at the device when there's a leak, and it records the location, making a map of critical leaks to fix." From the report: MIT doctoral student You Wu spent six years developing the design, building on research that earlier students began under a project sponsored by a university in Saudi Arabia, where most drinking water comes from expensive desalination plants and around a third of it is lost to leaks. It took three years before he had a working prototype. Then Wu got inspiration from an unexpected source: At a party with his partner, he accidentally stepped on her dress. She noticed immediately, unsurprisingly, and Wu realized that he could use a similar skirt-like design on a robot so that the robot could detect subtle tugs from the suction at each leak. Wu graduated from MIT in June, and is now launching the technology through a startup called WatchTower Robotics. The company will soon begin pilots in Australia and in Cambridge, Massachusetts. One challenge now, he says, is creating a guide so water companies can use the device on their own.
AI

'I've Seen the Future of Consumer AI, and it Doesn't Have One' (theregister.co.uk) 137

Andrew Orlowski of The Register recounts all the gadgets supercharged with AI that he came across at the IFA tradeshow last week -- and wonders what value AI brought to the table. He writes: I didn't see a blockchain toothbrush at IFA in Berlin last week, but I'm sure there was one lurking about somewhere. With 30 vast halls to cover, I didn't look too hard for it. But I did see many things almost as tragic that no one could miss -- AI being squeezed into almost every conceivable bit of consumer electronics. But none were convincing. If ever there was a solution looking for a problem, it's ramming AI into gadgets to show off a company's machine learning prowess. For the consumer it adds unreliability, cost and complexity, and the annoyance of being prompted.

[...] Back to LG, which takes 2018's prize for sticking AI into a superfluous gadget. The centrepiece of its AI efforts this year is a robot, CLOi. Put Google Assistant or Alexa on wheels, and you have CLOi. I asked the booth person what exactly CLOi could do, only to be told "it can take notes for your shopping list." Why wasn't this miracle of the Fourth Industrial Revolution let loose on the LG floor? I wondered -- a question answered by this account of CLOi's debut at CES in January. Clearly things haven't improved much -- this robot buddy was kept indoors.

Robotics

Robot Boat Sails Into History By Finishing Atlantic Crossing (apnews.com) 42

An anonymous reader writes: For the first time an autonomous sailing robot has completed the Microtransat Challenge by crossing the Atlantic from Newfoundland, Canada to Ireland. The Microtransat has been running since 2010 and has seen 23 previous entries all fail to make it across. The successful boat, SB Met was built by the Norwegian company Offshore Sensing AS and is only 2 metres (6.5 ft) long. It completed the crossing on August 26th, 79 days and 5000 km (3100 miles) of sailing after departing Newfoundland on June 7th. Further reading: A Fleet of Sailing Robots Sets Out To Quantify the Oceans.
Robotics

How Telepresence Robots Are Combating the Debilitating Effects of Isolation and Loneliness (bbc.com) 28

Internet-connected robots that can stream audio and video are increasingly helping housebound sick children and elderly people keep in touch with teachers, family and friends, combating the scourge of isolation and loneliness. BBC: Zoe Johnson, 16, hasn't been to school since she was 12. She went to the doctor in 2014 "with a bit of a sore throat", and "somehow that became A&E [accident and emergency]," says her mother, Rachel Johnson. The doctors diagnosed myalgic encephalomyelitis, ME for short, also known as Chronic Fatigue Syndrome -- a debilitating illness affecting the nervous and immune systems. Zoe missed a lot of school but was able to continue with her studies with the help of an online tutor. But "over the years her real-world friendships disappeared because she's not well enough to see anybody," says Ms Johnson. For the last three months, though, she has been taking classes alongside her former classmates using a "telepresence" robot called AV1. The small, cute-looking robot, made by Oslo-based start-up No Isolation, sits in the classroom and live streams video and audio back to Zoe's tablet or smartphone at home. She can speak through the robot and take part in lessons, also controlling where AV1 is looking.
Earth

Google Funds A Starfish-Killing Robot To Save Australia's Great Barrier Reef (abc.net.au) 122

"It looks like a tiny yellow submarine, but this underwater drone is on a mission to kill," reports ABC. Specifically, to kill the starfish that are destroying coral on Australia's Great Barrier Reef. An anonymous reader quotes ABC: In a bid to eradicate the pest, Queensland researchers have developed world-first robots to administer a lethal injection to the starfish using new technology... Researcher Matt Dunbabin said the technology was 99.4 per cent accurate in delivering a toxic substance only harmful to the starfish.... Divers have played a big role in helping to combat the starfish, but Professor Dunbabin said the robot would take the efforts to the next level. "Divers currently control certain areas, but there are not enough divers to actually make a difference on the scale of the reef," he said. The drone can also monitor and gather huge amounts of data about coral bleaching, water quality and pollution.
"RangerBot will be designed to stay underwater almost three times longer than a human diver, gather vastly more data, map expansive underwater areas at scales not previously possible, and operate in all conditions and all times of the day or night," according to Researchers at the Queensland University of Technology.

The starfish-killing robots were partially funded by Google (through their Google.org Impact Challenge program to fund and support nonprofit innovators), reports The Drive. One study had found the reef's coral cover declined 50% between 1985 and 2012, "with nearly half of that drop resulting from the coral-destroying starfish species."
Communications

Two Months Later: NASA's Opportunity Rover Is Still Lost On Mars After Huge Dust Storm (space.com) 46

Two months have passed since NASA's Opportunity Mars rover last phoned home. The last time we reported on the rover was on June 12th, when it was trying to survive an intensifying dust storm that was deemed "much worse than a 2007 storm that Opportunity weathered," according to NASA. "The previous storm had an opacity level, or tau, somewhere above 5.5; this new storm had an estimated tau of 10.8." Space.com reports on Opportunity's current status: Opportunity hasn't made a peep since June 10, when dust in the Red Planet's air got so thick that the solar-powered rover couldn't recharge its batteries. Opportunity's handlers think the six-wheeled robot has put itself into a sort of hibernation, and they still hope to get a ping once the dust storm has petered out. And there are good reasons for this optimism, NASA officials said. "Because the batteries were in relatively good health before the storm, there's not likely to be too much degradation," NASA officials wrote in an Opportunity update Thursday (Aug. 16). "And because dust storms tend to warm the environment -- and the 2018 storm happened as Opportunity's location on Mars entered summer -- the rover should have stayed warm enough to survive."

Engineers are trying to communicate with Opportunity several times a week using NASA's Deep Space Network, a system of big radio dishes around the globe. They hail the robot during scheduled "wake-up times" and then listen for a response. And team members are casting a wider net, too: Every day, they sift through all radio signals received from Mars, listening for any chirp from Opportunity, NASA officials said. Even if Opportunity does eventually wake up and re-establish contact, its long ordeal may end up taking a toll on the rover.
"The rover's batteries could have discharged so much power -- and stayed inactive so long -- that their capacity is reduced," NASA officials wrote in the update. "If those batteries can't hold as much charge, it could affect the rover's continued operations. It could also mean that energy-draining behavior, like running its heaters during winter, could cause the batteries to brown out."
Robotics

Baseball Players Want Robots To Be Their Umps (technologyreview.com) 99

The sports world has been dealing with the human error of referees and umpires for decades -- it's pretty much tradition at this point. But with technology that can assess the game more accurately, some athletes are ready to push the people calling balls and strikes off the field in favor of machines. From a report: On Tuesday, Chicago Cubs second baseman Ben Zobrist, one of the most vocal supporters of turning over baseball rulings to software, used an argument with the umpire as a chance to advocate for a change in the league. The comment reinvigorated a long-standing debate over automation in sports. You're out! As you watch baseball on television, a graphic is often overlaid on the action that shows in real time whether a pitch is a ball or a strike. But human umps are still making the calls on the field based on nothing but their own eyes. Increasingly, viewers and players would rather have the technology take over.
Robotics

Children 'At Risk of Robot Influence' (bbc.co.uk) 81

An anonymous reader shares a report: Forget peer pressure, future generations are more likely to be influenced by robots, a study suggests. The research, conducted at the University of Plymouth, found that while adults were not swayed by robots, children were. The fact that children tended to trust robots without question raised ethical issues as the machines became more pervasive, said researchers. They called for the robotics community to build in safeguards for children. Those taking part in the study completed a simple test, known as the Asch paradigm, which involved finding two lines that matched in length. Known as the conformity experiment, the test has historically found that people tend to agree with their peers even if individually they have given a different answer. In this case, the peers were robots. When children aged seven to nine were alone in the room, they scored an average of 87% on the test. But when the robots joined them, their scores dropped to 75% on average. Of the wrong answers, 74% matched those of the robots.
Robotics

Anki's New Robot Has Artificial Emotional Intelligence (fastcompany.com) 35

harrymcc writes: Toymaker Anki, whose Cozmo robot has been a hit, has announced its next bot: Vector. Though it looks a lot like Cozmo, it packs far more computational power -- Cozmo relied on a phone app for smarts -- and utilizes deep-learning tech in the interest of giving Vector a subtler, more engaging personality. Over at Fast Company, Sean Captain has a deep dive into the software engineering that went into the effort. Vector is powered by a quad-core Qualcomm Snapdragon 212 chip, and has cartoon eyes displayed on a 184 x 96-pixel screen. The robot actually scans its environment via a single 720p wide-angle camera mounted below the screen. "Cozmo springs to attention when you call its name, making twittering sounds, and lifting its bulldozer-like arms up and down," writes Captain. "If you ignore Cozmo, the bot gets more in your face, or makes loud, obnoxious snoring sounds."

While Vector can connect to the internet and display weather information, set timers, and speak answers to various questions, it's the social and visual intelligence that people may fall in love with the most. Vector is able to detect people and interact with them, even when faces aren't visible. Computer vision technical director Andrew Stein and his team "trained a convolutional neural network (CNN) -- a popular deep-learning AI technology that mimics the brain's visual cortex," reports Captain. "Using the often blurry and distorted footage that Vector's camera captures as he moves around, Stein has been teaching the CNN to detect people from the back or the side, for instance, up to about 10 feet away."
Robotics

New Study Finds It's Harder To Turn Off a Robot When It's Begging For Its Life (theverge.com) 327

An anonymous reader quotes a report from The Verge: [A] recent experiment by German researchers demonstrates that people will refuse to turn a robot off if it begs for its life. In the study, published in the open access journal PLOS One, 89 volunteers were recruited to complete a pair of tasks with the help of Nao, a small humanoid robot. The participants were told that the tasks (which involved answering a series of either / or questions, like "Do you prefer pasta or pizza?"; and organizing a weekly schedule) were to improve Nao's learning algorithms. But this was just a cover story, and the real test came after these tasks were completed, when scientists asked participants to turn off the robot. In roughly half of the experiments, the robot protested, telling participants it was afraid of the dark and even begging: "No! Please do not switch me off!" When this happened, the human volunteers were likely to refuse to turn the bot off. Of the 43 volunteers who heard Nao's pleas, 13 refused. And the remaining 30 took, on average, twice as long to comply compared to those who did not hear the desperate cries at all.
Robotics

OpenAI's Dactyl System Gives Robots Humanlike Dexterity (venturebeat.com) 22

An anonymous reader quotes a report from VentureBeat: In a forthcoming paper ("Dexterous In-Hand Manipulation"), OpenAI researchers describe a system that uses a reinforcement learning model, in which the AI [known as Dactyl] learns through trial and error, to direct robot hands in grasping and manipulating objects with state-of-the-art precision. All the more impressive, it was trained entirely digitally, in a computer simulation, and wasn't provided any human demonstrations by which to learn. The researchers used the MuJoCo physics engine to simulate a physical environment in which a real robot might operate, and Unity to render images for training a computer vision model to recognize poses. But this approach had its limitations, the team writes -- the simulation was merely a "rough approximation" of the physical setup, which made it "unlikely" to produce systems that would translate well to the real world. Their solution was to randomize aspects of the environment, like its physics (friction, gravity, joint limits, object dimensions, and more) and visual appearance (lighting conditions, hand and object poses, materials, and textures). This both reduced the likelihood of overfitting -- a phenomenon that occurs when a neural network learns noise in training data, negatively affecting its performance -- and increased the chances of producing an algorithm that would successfully choose actions based on real-world fingertip positions and object poses.
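
To make the randomization step concrete, here is a minimal, hypothetical Python sketch of domain randomization as the paragraph describes it: every training episode samples fresh physics and appearance parameters before the simulated rollout begins, so the policy never sees exactly the same world twice. The parameter names and ranges are illustrative assumptions, not OpenAI's actual configuration.

    # Hypothetical sketch of domain randomization: each episode samples new physics
    # and appearance parameters so the policy cannot overfit to any single simulated
    # world. The ranges and names are illustrative, not OpenAI's published values.
    import random
    from dataclasses import dataclass

    @dataclass
    class EpisodeParams:
        friction: float         # surface friction coefficient
        gravity: float          # m/s^2, varied around Earth gravity
        object_scale: float     # multiplier on the block's nominal dimensions
        light_intensity: float  # arbitrary rendering units

    def sample_episode_params() -> EpisodeParams:
        return EpisodeParams(
            friction=random.uniform(0.5, 1.5),
            gravity=random.uniform(9.0, 10.6),
            object_scale=random.uniform(0.95, 1.05),
            light_intensity=random.uniform(0.3, 1.0),
        )

    def run_training(num_episodes: int) -> None:
        for episode in range(num_episodes):
            params = sample_episode_params()
            # A real implementation would push these values into the physics engine
            # and renderer (e.g. MuJoCo / Unity) before rolling out the policy.
            print(f"episode {episode}: {params}")

    if __name__ == "__main__":
        run_training(3)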

Next, the researchers trained the model -- a recurrent neural network -- with 384 machines, each with 16 CPU cores, allowing them to generate roughly two years of simulated experience per hour. After optimizing it on an eight-GPU PC, they moved on to the next step: training a convolutional neural network that would predict the position and orientation of objects in the robot's "hand" from three simulated camera images. Once the models were trained, it was on to validation tests. The researchers used a Shadow Dexterous Hand, a five-fingered robotic hand with a total of 24 degrees of freedom, mounted on an aluminum frame, to manipulate objects. Two sets of cameras, meanwhile -- motion capture cameras as well as RGB cameras -- served as the system's eyes, allowing it to track the objects' rotation and orientation. In the first of two tests, the algorithms were tasked with reorienting a block labeled with letters of the alphabet. The team chose a random goal, and each time the AI achieved it, they selected a new one until the robot (1) dropped the block, (2) spent more than a minute manipulating the block, or (3) reached 50 successful rotations. In the second test, the block was swapped with an octagonal prism. The result? The models not only exhibited "unprecedented" performance, but naturally discovered types of grasps observed in humans, such as tripod (a grip that uses the thumb, index finger, and middle finger), prismatic (a grip in which the thumb and finger oppose each other), and tip pinch grip. They also learned how to pivot and slide the robot hand's fingers, and how to use gravitational, translational, and torsional forces to slot the object into the desired position.
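
The validation procedure described above reduces to a simple loop: pick a random goal orientation, let the policy chase it, and end the run when the block is dropped, a single goal takes longer than a minute, or 50 goals have been reached. A hedged Python sketch, with stand-in policy and hand objects (none of these are OpenAI's actual interfaces), might look like this:

    # Hypothetical sketch of the evaluation loop described above. The policy, hand,
    # and pose checks are stand-ins; only the termination rules (dropped block, more
    # than a minute on one goal, or 50 successful rotations) follow the report.
    import random
    import time

    MAX_SUCCESSES = 50
    GOAL_TIME_LIMIT_S = 60.0

    def random_goal_orientation():
        """Stand-in: a random target orientation for the block (roll, pitch, yaw)."""
        return tuple(random.uniform(-3.14, 3.14) for _ in range(3))

    def evaluate(policy, hand):
        successes = 0
        goal = random_goal_orientation()
        goal_start = time.monotonic()
        while True:
            action = policy(hand.observe(), goal)   # stand-in policy call
            hand.step(action)                       # stand-in actuation call
            if hand.block_dropped():
                break                               # (1) run ends if the block falls
            if time.monotonic() - goal_start > GOAL_TIME_LIMIT_S:
                break                               # (2) run ends if a goal takes > 1 minute
            if hand.reached(goal):
                successes += 1
                if successes >= MAX_SUCCESSES:
                    break                           # (3) run ends after 50 rotations
                goal = random_goal_orientation()    # otherwise pick a new goal
                goal_start = time.monotonic()
        return successes

    if __name__ == "__main__":
        class DummyHand:                            # toy stand-in so the sketch runs
            def observe(self): return None
            def step(self, action): pass
            def block_dropped(self): return random.random() < 0.01
            def reached(self, goal): return random.random() < 0.2
        print(evaluate(lambda obs, goal: None, DummyHand()))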

Robotics

Human Bankers Are Losing To Robots as Nordea Sets a New Standard (bloomberg.com) 78

Something interesting happened in Swedish finance last quarter. The only big bank that managed to cut costs also happens to be behind one of the industry's boldest plans to replace humans with automation. From a report: Nordea Bank AB, whose Chief Executive Officer Casper von Koskull says his industry might only have half its current human workforce a decade from now, is cutting 6,000 of those jobs. Von Koskull says the adjustment is the only way to stay competitive in the future, with automation and robots taking over from people in everything from asset management to answering calls from retail clients. While many in the finance industry have struggled to digest that message, the latest set of bank results in Sweden suggests that executives in one of the planet's most technologically advanced corners are drawing inspiration from Nordea. At SEB AB, CEO Johan Torgeby now says that "whatever can be automated will be automated."
