iPhone

Apple's iPhone Plans for 2027: Foldable, or Glass and Curved. (Plus Smart Glasses, Tabletop Robot) (theverge.com) 45

An anonymous reader shared this report from the Verge: This morning, while summarizing an Apple "product blitz" he expects for 2027, Bloomberg's Mark Gurman writes in his Power On newsletter that Apple is planning a "mostly glass, curved iPhone" with no display cutouts for that year, which happens to be the iPhone's 20th anniversary... [T]he closest hints are probably in Apple patents revealed over the years, like one from 2019 that describes a phone encased in glass that "forms a continuous loop" around the device.

Apart from a changing iPhone, Gurman describes what sounds like a big year for Apple. He reiterates past reports that the first foldable iPhone should be out by 2027, and that the company's first smart glasses, a competitor to Meta's Ray-Bans, will arrive that year. So will those rumored camera-equipped AirPods and Apple Watches, he says. Gurman also suggests that Apple's home robot — a tabletop robot that features "an AI assistant with its own personality" — will come in 2027...

Finally, Gurman writes that by 2027 Apple could finally ship an LLM-powered Siri and may have created new chips for its server-side AI processing.

Earlier this week Bloomberg reported that Apple is also "actively looking at" revamping the Safari web browser on its devices "to focus on AI-powered search engines." (Apple's senior VP of services "noted that searches on Safari dipped for the first time last month, which he attributed to people using AI.")
Education

Is Everyone Using AI to Cheat Their Way Through College? (msn.com) 160

Chungin Lee used ChatGPT to help write the essay that got him into Columbia University — and then "proceeded to use generative artificial intelligence to cheat on nearly every assignment," reports New York magazine's blog Intelligencer: As a computer-science major, he depended on AI for his introductory programming classes: "I'd just dump the prompt into ChatGPT and hand in whatever it spat out." By his rough math, AI wrote 80 percent of every essay he turned in. "At the end, I'd put on the finishing touches. I'd just insert 20 percent of my humanity, my voice, into it," Lee told me recently... When I asked him why he had gone through so much trouble to get to an Ivy League university only to off-load all of the learning to a robot, he said, "It's the best place to meet your co-founder and your wife."
He eventually did meet a co-founder, and after three unpopular apps, they found success by creating the "ultimate cheat tool" for remote coding interviews, according to the article. "Lee posted a video of himself on YouTube using it to cheat his way through an internship interview with Amazon. (He actually got the internship, but turned it down.)" The article ends with Lee and his co-founder raising $5.3 million from investors for one more AI-powered app, and Lee says they'll target the standardized tests used for graduate school admissions, as well as "all campus assignments, quizzes, and tests. It will enable you to cheat on pretty much everything."

Somewhere along the way Columbia put him on disciplinary probation — not for cheating in coursework, but for creating the apps. But "Lee thought it absurd that Columbia, which had a partnership with ChatGPT's parent company, OpenAI, would punish him for innovating with AI." (OpenAI has even made ChatGPT Plus free to college students during finals week, the article points out, with OpenAI saying their goal is just teaching students how to use it responsibly.) Although Columbia's policy on AI is similar to that of many other universities' — students are prohibited from using it unless their professor explicitly permits them to do so, either on a class-by-class or case-by-case basis — Lee said he doesn't know a single student at the school who isn't using AI to cheat. To be clear, Lee doesn't think this is a bad thing. "I think we are years — or months, probably — away from a world where nobody thinks using AI for homework is considered cheating," he said...

In January 2023, just two months after OpenAI launched ChatGPT, a survey of 1,000 college students found that nearly 90 percent of them had used the chatbot to help with homework assignments.

The article points out ChatGPT's monthly visits increased steadily over the last two years — until June, when students went on summer vacation. "College is just how well I can use ChatGPT at this point," a student in Utah recently captioned a video of herself copy-and-pasting a chapter from her Genocide and Mass Atrocity textbook into ChatGPT.... It isn't as if cheating is new. But now, as one student put it, "the ceiling has been blown off." Who could resist a tool that makes every assignment easier with seemingly no consequences?
After using ChatGPT for her final semester of high school, one student says "My grades were amazing. It changed my life." So she continued using it in college, and "Rarely did she sit in class and not see other students' laptops open to ChatGPT."

One ethics professor even says "The students kind of recognize that the system is broken and that there's not really a point in doing this." (Yes, students are even using AI to cheat in ethics classes...) It's not just the students: Multiple AI platforms now offer tools to leave AI-generated feedback on students' essays. Which raises the possibility that AIs are now evaluating AI-generated papers, reducing the entire academic exercise to a conversation between two robots — or maybe even just one.
Robotics

Amazon Says New Warehouse Robot Can 'Feel' Items, But Won't Replace Workers (cnbc.com) 61

An anonymous reader quotes a report from CNBC: There's a new warehouse robot at Amazon that has a sense of touch, allowing it to handle a job previously only done by humans. Amazon unveiled the robot, called Vulcan, Wednesday at an event in Germany. CNBC got an exclusive first look at Vulcan in April, as it stowed items into tall, yellow bins at a warehouse in Spokane, Washington. An up-close look at the "hand" of the robot reveals how it can feel the items it touches using an AI-powered sensor to determine the precise pressure and torque each object needs. This innovative gripper helps give Vulcan the ability to manipulate 75% of the 1 million unique items in inventory at the Spokane warehouse. Amazon has used other robotic arms inside its warehouses since 2021, but those rely on cameras for detection and suction for grasp, limiting what types of objects they can handle.

Vulcan can also operate 20 hours a day, according to Aaron Parness, who heads up the Amazon Robotics team that developed the machine. Still, Parness told CNBC that instead of replacing people in its warehouses, Vulcan will create new, higher skilled jobs that involve maintaining, operating, installing and building the robots. When asked if Amazon will fully automate warehouses in the future, Parness said, "not at all." "I don't believe in 100% automation," he said. "If we had to get Vulcan to do 100% of the stows and picks, it would never happen. You would wait your entire life. Amazon understands this." The goal is for Vulcan to handle 100% of the stowing that happens in the top rows of bins, which are difficult for people to reach, Parness said. [...] Amazon said Vulcan is operating at about the same speed as a human worker and can handle items up to 8 pounds. It operates behind a fence, sequestered from human workers to reduce the risk of accidents.

Robotics

Hyundai Unleashes Atlas Robots In Georgia Plant (interestingengineering.com) 61

Hyundai Motor Group is accelerating its factory automation efforts by deploying Atlas humanoid robots from Boston Dynamics at its Metaplant America facility in Georgia, as part of a broader $21 billion U.S. investment strategy to boost efficiency and local production amid rising tariffs. InterestingEngineering reports: At Hyundai Motor Group Metaplant America, Hyundai already uses Spot robots -- four-legged machines -- for industrial inspections. In addition, the plant features a dedicated robot that removes car doors before the vehicles enter General Assembly, and a fixed robot that reinstalls the doors toward the end of the process -- a technology unique to the Georgia facility.

The South Korean automaker has not disclosed how many Atlas robots will be deployed at the facility or what specific tasks they will perform. According to reports, the company plans to further expand the use of robots across its global manufacturing facilities, streamlining processes and enhancing efficiency. [...] The automaker aims to manufacture 300,000 electric and hybrid vehicles annually at the new facility. At its recent Grand Opening Ceremony, the company announced plans to ramp up production to 500,000 units over time, without specifying a timeline.

Robotics

Disneyland Imagineers Defend New Show Recreating Walt Disney as a Robot (yahoo.com) 27

"When Disneyland turns 70 this July, Main Street's Opera House will play host to the return of Walt Disney, who will sit down with audiences to tell his story in robot form," writes Gizmodo.

But they point out Walt's granddaughter Johanna Miller wrote a Facebook post opposing the idea in November. ("They are Dehumanizing him. People are not replaceable...") The idea of a Robotic Grampa to give the public a feeling of who the living man was just makes no sense. It would be an imposter... You could never get the casual ness of his talking interacting with the camera his excitement to show and tell people about what is new at the park.

You can not add life to one. Empty of a soul or essence of the man. Knowing that he did not want this. Having your predecessors tell you that this was out of bounds.... So so Sad and disappointed.

The Facebook post claims that the son of a Disney engineer even remembers Walt saying that he never wanted to be an animatronic himself. And "Members of the Walt Disney family are said to be divided," reports the Los Angeles Times, "with many supporting the animatronic and some others against it, say those in the know who have declined to speak on the record for fear of ruining their relationships."

So that Facebook post "raised anew ethical questions that often surround any project attempting to capture the dead via technology," their article adds, "be it holographic representations of performers or digitally re-created cinematic animations." And then some media outlets got a partial preview Wednesday, the Los Angeles Times reports: An early sculpt of what would become the animatronic was revealed, one complete with age spots on Disney's hands and weariness around his eyes — Imagineers stressed their intent is faithful accuracy — but much of the attraction remains secretive. The animatronic wasn't shown, nor did Imagineering provide any images of the figure, which it promises will be one of its most technically advanced. Instead, Imagineering sought to show the care with which it was bringing Disney back to life while also attempting to assuage any fears regarding what has become a much-debated project among the Disney community...

Longtime Imagineer Tom Fitzgerald, known for his work on beloved Disney projects such as Star Tours and the Guardians of the Galaxy coaster in Florida, said Wednesday that "A Magical Life" has been in the works for about seven years. Asked directly about ethical concerns in representing the deceased via a robotic figurine, Fitzgerald noted the importance of the Walt Disney story, not only to the company but to culture at large... "What could we do at Disneyland for our audience that would be part of our tool kit vernacular but that would bring Walt to life in a way that you could only experience at the park? We felt the technology had gotten there. We felt there was a need to tell that story in a fresh way...."

"Walt Disney — A Magical Life" will walk a fine line when it opens, attempting to inspire a new generation to look into Disney's life while also portraying him as more than just a character in the park's arsenal. "Why are we doing this now?" Fitzgerald says. "For two reasons. One is Disneyland's 70th anniversary is an ideal time we thought to create a permanent tribute to Walt Disney in the Opera House. The other: I grew up watching Walt Disney on television. I guess I'm the old man. He came into our living room every week and chatted and it was very casual and you felt like you knew the man. But a lot of people today don't know Walt Disney was an individual. They think Walt Disney is a company."

And now nearly 60 years after his death, Disney will once again grace Main Street, whether or not audiences — or even some members of his family — are ready to greet him.

Robotics

AI-Driven Robot Installs Nearly 10,000 Solar Modules in Australia (cleantechnica.com) 56

Long-time Slashdot reader AmiMoJo shares an article from Renewables Now: Chinese tech company Leapting has successfully completed its first commercial deployment of photovoltaic (PV) modules with an AI-driven solar module mounting robot in Australia. The Chinese company was tasked with supporting the installation of French Neoen's (EPA:NEOEN) 350-MW/440-MWp Culcairn Solar Farm in New South Wales' Riverina region. Shanghai-based Leapting said this week that its intelligent robot has installed almost 10,000 modules at an "efficient, safe, and stable" pace that has "significantly" reduced the original construction timeline.

Litian Intelligent was deployed at the Australian project site in early February. The machine has a 2.5-metre-high robotic arm sitting on a self-guided, self-propelled crawler. Equipped with a navigation system and visual recognition technology, it can lift and mount PV panels weighing up to 30 kilograms. By replacing labour-intensive manual operations, the robot shortens the module installation cycle by 25%, while installation efficiency increases three to five times compared with manual labour, and the system is easily adapted to complex environments, Leapting says.

Or, as Clean Technica puts it, "Meet the robot replacing four workers at a time on solar projects." This is part of a broader industrial trend. In the United States, Rosendin Electric demonstrated its own semi-autonomous system in Texas that allowed a two-person team to install 350 to 400 modules per day, a clear step-change from traditional methods. AES Corporation has been developing a robot called Maximo that combines placement and fastening with computer vision. Trina Solar's Trinabot in China operates in a similar space, with prototype systems demonstrating 50-plus modules per hour... In an industry where time-to-energy is critical, shaving weeks off the construction schedule directly reduces costs and increases net revenue...

[T]he direction is clear. The future of solar construction will be faster, safer, and more precise — not because of human brawn, but because of robotic repetition. There will still be humans on-site, but their role shifts from lifting panels to managing throughput. Just as cranes and excavators changed civil construction, so too will robots like Leapting's define the next era of solar deployment.

Robotics

Soft Vine-Like Robot Helps Rescuers Find Survivors In Disaster Zones (mit.edu) 15

New submitter MicroBitz shares a report: SPROUT, short for Soft Pathfinding Robotic Observation Unit, is a flexible, vine-like robot developed by MIT Lincoln Laboratory in collaboration with the University of Notre Dame. Unlike rigid robots or static cameras, SPROUT can "grow" into tight, winding spaces that are otherwise inaccessible, giving first responders a new way to explore, map and assess collapsed structures. Beyond disaster response, the technology could be adapted for inspecting military systems or critical infrastructure in hard-to-reach places, making SPROUT a versatile tool for a variety of high-stakes scenarios. "The urban search-and-rescue environment can be brutal and unforgiving, where even the most hardened technology struggles to operate. The fundamental way a vine robot works mitigates a lot of the challenges that other platforms face," says Chad Council, a member of the SPROUT team, which is led by Nathaniel Hanson.

"The mechanical performance of the robots has an immediate effect, but the real goal is to rethink the way sensors are used to enhance situational awareness for rescue teams," adds Hanson. "Ultimately, we want SPROUT to provide a complete operating picture to teams before anyone enters a rubble pile."

You can see the SPROUT vine robot in action in a YouTube video from MIT Lincoln Laboratory.
Robotics

China Pits Humanoid Robots Against Humans In Half-Marathon (msn.com) 25

An anonymous reader quotes a report from Reuters: Twenty-one humanoid robots joined thousands of runners at the Yizhuang half-marathon in Beijing on Saturday, the first time these machines have raced alongside humans over a 21-km (13-mile) course. The robots from Chinese manufacturers such as DroidVP and Noetix Robotics came in all shapes and sizes, some shorter than 120 cm (3.9 ft), others as tall as 1.8 m (5.9 ft). One company boasted that its robot looked almost human, with feminine features and the ability to wink and smile.

Some firms tested their robots for weeks before the race. Beijing officials have described the event as more akin to a race car competition, given the need for engineering and navigation teams. "The robots are running very well, very stable ... I feel I'm witnessing the evolution of robots and AI," said spectator He Sishu, who works in artificial intelligence. The robots were accompanied by human trainers, some of whom had to physically support the machines during the race.

A few of the robots wore running shoes, with one donning boxing gloves and another wearing a red headband with the words "Bound to Win" in Chinese. The winning robot was Tiangong Ultra, from the Beijing Innovation Center of Human Robotics, with a time of 2 hours and 40 minutes. The men's winner of the race had a time of 1 hour and 2 minutes. [...] Some robots, like Tiangong Ultra, completed the race, while others struggled from the beginning. One robot fell at the starting line and lay flat for a few minutes before getting up and taking off. One crashed into a railing after running a few metres, causing its human operator to fall over.
You can watch a recording of the race in its entirety on YouTube.
Robotics

Harvard's RoboBee Masters Landing, Paving Way For Agricultural Pollination (chosun.com) 31

After more than a decade of development, Harvard's insect-sized flying robot, RoboBee, has successfully learned to land using dragonfly-inspired legs and improved flight controls. The researchers see RoboBee as a potential substitute for endangered bees, assisting in the pollination of plants. From a report: RoboBee is a micro flying robot that Harvard has been developing since 2013. As the name suggests, it is the size of a bee, capable of flying like a bee and hovering in mid-air. Its wings are 3 cm long and it weighs only 0.08 g. The weight was reduced by using light piezoelectric elements instead of motors. Piezoelectric elements change shape when an electric current flows through them. The researchers were able to make RoboBee flap its wings 120 times per second by turning the current on and off, which is similar to actual insects.

While RoboBee exhibited flight capabilities comparable to those of a bee, the real problem was landing. Being too light and having short wings, it could not withstand the air turbulence generated during landing. It is easy to understand if you think about the strong winds generated when a helicopter approaches the ground. Christian Chan, a graduate student at Harvard who participated in the research, said, "Until now, it was a matter of shutting off the robot while it attempted to land and praying for a proper touchdown."

To ensure RoboBee's safe landing, it was important to dissipate energy just before touchdown. Hyun Nak-Seung, a professor at Purdue University who participated in the development of RoboBee, explained, "For any flying object, the success of landing depends on minimizing speed just before impact and rapidly dissipating energy afterward. Even for tiny flapping like RoboBee's, the ground effect cannot be ignored, and after landing, the risk of bouncing or rolling makes the situation more complex."
The findings have been published in the journal Science Robotics.
Google

Samsung and Google Partner To Launch Ballie Home Robot with Built-in Projector (engadget.com) 25

Samsung Electronics and Google Cloud are jointly entering the consumer robotics market with Ballie, a yellow, soccer-ball-shaped robot equipped with a video projector and powered by Google's Gemini AI models. First previewed in 2020, the long-delayed device will finally launch this summer in the US and South Korea. The mobile companion uses small wheels to navigate homes autonomously and integrates with Samsung's SmartThings platform to control smart home devices.

Running on Samsung's Tizen operating system, Ballie can manage calendars, answer questions, handle phone calls, and project video content from services including YouTube and Netflix. Samsung EVP Jay Kim described it as a "completely new Ballie" compared to the 2020 version, with Google Cloud integration being the most significant change. The robot leverages Gemini for understanding commands, searching the web, and processing visual data for navigation, while using Samsung's AI models for accessing personal information.
AI

New Tinder Game 'Lets You Flirt With AI Characters. Three of Them Dumped Me' (msn.com) 72

Tinder "is experimenting with a chatbot that claims to help users improve their flirting skills," notes Washington Post internet-culture reporter Tatum Hunter. The chatbot is available only to users in the United States on iPhones for a limited time, and powered by OpenAI's GPT-4o each character "kicks off an improvised conversation, and the user responds out loud with something flirty..."

"Three of them dumped me." You can win points for banter the app deems "charming" or "playful." You lose points if your back-and-forth seems "cheeky" or "quirky"... It asked me to talk out loud into my phone and win the romantic interest of various AI characters.

The first scenario involved a financial analyst named Charles, whom I've supposedly run into at the Tokyo airport after accidentally swapping our luggage. I tried my best to be polite to the finance guy who stole my suitcase, asking questions about his travel and agreeing to go to coffee. But the game had some critical feedback: I should try to connect more emotionally using humor or stories from my life. My next go had me at a Dallas wedding trying to flirt with Andrew, a data analyst who had supposedly stumbled into the venue, underdressed, because he'd been looking for a quiet spot to ... analyze data. This time I kept things playful, poking fun at Andrew for crashing a wedding. Andrew didn't like that. I'd "opted to disengage" by teasing this person instead of helping him blend in at the wedding, the app said. A failure on my part, apparently — and also a reminder why generative AI doesn't belong everywhere...

Going in, I was worried Tinder's AI characters would outperform the people I've met on dating apps and I'd fall down a rabbit hole of robot love. Instead, they behaved in a way typical for chatbots: Drifting toward biased norms and failing to capture the complexity of human emotions and interactions. The "Game Game" seemed to replicate the worst parts of flirting — the confusion, the unclear expectations, the uncomfortable power dynamics — without the good parts, like the spark of curiosity about another person. Tinder released the feature on April Fools' Day, likely as a bid for impressions and traffic. But its limitations overshadowed its novelty...

Hillary Paine, Tinder's vice president of product, growth and revenue, said in an email that AI will play a "big role in the future of dating and Tinder's evolution." She said the game is meant to be silly and that the company "leaned into the campiness." Gen Z is a socially anxious generation, Paine said, and this age group is willing to endure a little cringe if it leads to a "real connection."

The article suggests it's another example of companies "eager to incorporate this newish technology, often without considering whether it adds any value for users." But "As apps like Tinder and Bumble lose users amid 'dating app burnout,' the companies are turning to AI to win new growth." (The dating app Rizz "uses AI to autosuggest good lines to use," while Teaser "spins up a chatbot that's based on your personality, meant to talk and behave like you would during a flirty chat," and people "are forming relationships with AI companion bots by the millions.") And the companion-bot company Replika "boasts more than 30 million users..."
Robotics

China is Already Testing AI-Powered Humanoid Robots in Factories (msn.com) 71

The U.S. and China "are racing to build a truly useful humanoid worker," the Wall Street Journal wrote Saturday, adding that "Whoever wins could gain a huge edge in countless industries."

"The time has come for robots," Nvidia's chief executive said at a conference in March, adding "This could very well be the largest industry of all." China's government has said it wants the country to be a world leader in humanoid robots by 2027. "Embodied" AI is listed as a priority of a new $138 billion state venture investment fund, encouraging private-sector investors and companies to pile into the business. It looks like the beginning of a familiar tale. Chinese companies make most of the world's EVs, ships and solar panels — in each case, propelled by government subsidies and friendly regulations. "They have more companies developing humanoids and more government support than anyone else. So, right now, they may have an edge," said Jeff Burnstein [president of the Association for Advancing Automation, a trade group in Ann Arbor, Michigan]....

Humanoid robots need three-dimensional data to understand physics, and much of it has to be created from scratch. That is where China has a distinct edge: The country is home to an immense number of factories where humanoid robots can absorb data about the world while performing tasks. "The reason why China is making rapid progress today is because we are combining it with actual applications and iterating and improving rapidly in real scenarios," said Cheng Yuhang, a sales director with Deep Robotics, one of China's robot startups. "This is something the U.S. can't match." UBTech, the startup that is training humanoid robots to sort and carry auto parts, has partnerships with top Chinese automakers including Geely... "A problem can be solved in a month in the lab, but it may only take days in a real environment," said a manager at UBTech...

With China's manufacturing prowess, a locally built robot could eventually cost less than half as much as one built elsewhere, said Ming Hsun Lee, a Bank of America analyst. He said he based his estimates on China's electric-vehicle industry, which has grown rapidly to account for roughly 70% of global EV production. "I think humanoid robots will be another EV industry for China," he said. The UBTech robot system, called Walker S, currently costs hundreds of thousands of dollars including software, according to people close to the company. UBTech plans to deliver 500 to 1,000 of its Walker S robots to clients this year, including the Apple supplier Foxconn. It hopes to increase deliveries to more than 10,000 in 2027.

Few companies outside China have started selling AI-powered humanoid robots. Industry insiders expect the competition to play out over decades, as the robots tackle more-complicated environments, such as private homes.

The article notes "several" U.S. humanoid robot producers, including the startup Figure. And humanoid robots from Agility Robotics have been tested in Amazon warehouses since 2023. "The U.S. still has advantages in semiconductors, software and some precision components," the article points out.

But "Some lawmakers have urged the White House to ban Chinese humanoids from the U.S. and further restrict Chinese robot makers' access to American technology, citing national-security concerns..."
Robotics

US Robotics Companies Push For National Strategy To Compete With China (apnews.com) 50

U.S. robotics companies, including Tesla and Boston Dynamics, are urging lawmakers to establish a national robotics strategy to keep pace with China's aggressive investment in AI-driven robotics. The Associated Press reports: Jeff Cardenas, co-founder and CEO of humanoid startup Apptronik, of Austin, Texas, pointed out to lawmakers that it was American carmaker General Motors that deployed the first industrial robot at a New Jersey assembly plant in 1961. But the U.S. then ceded its early lead to Japan, which remains a powerhouse of industrial robotics, along with Europe. The next robotics race will be powered by artificial intelligence and will be "anybody's to win," Cardenas said in an interview after the closed-door meeting. "I think the U.S. has a great chance of winning. We're leading in AI, and I think we're building some of the best robots in the world. But we need a national strategy if we're going to continue to build and stay ahead."

The Association for Advancing Automation said a national strategy would help U.S. companies scale production and drive the adoption of robots as the "physical manifestation" of AI. The group made it clear that China and several other countries already have a plan in place. Without that leadership, "the U.S. will not only lose the robotics race but also the AI race," the association said in a statement. The group also suggested tax incentives to help drive adoption, along with federally-funded training programs and funding for both academic research and commercial innovation. A new federal robotics office, the association argued, is necessary partly because of "the increasing global competition in the space" as well as the "growing sophistication" of the technology.

Robotics

Nvidia Says 'the Age of Generalist Robotics Is Here' (theverge.com) 125

During the company's GTC 2025 keynote today, Nvidia founder and CEO Jensen Huang announced Isaac GR00T N1 -- the company's first open-source, pre-trained yet customizable foundation model designed to accelerate the development and capabilities of humanoid robots. "The age of generalist robotics is here," said Huang. "With Nvidia Isaac GR00T N1 and new data-generation and robot-learning frameworks, robotics developers everywhere will open the next frontier in the age of AI." The Verge reports: Huang demonstrated 1X's NEO Gamma humanoid robot performing autonomous tidying jobs using a post-trained policy built on the GR00T N1 model. [...] Other companies developing humanoid robots who have had early access to the GR00T N1 model include Boston Dynamics, the creators of Atlas; Agility Robotics; Mentee Robotics; and Neura Robotics. Originally announced as Project GR00T a year ago, the GR00T N1 foundation model utilizes a dual-system architecture inspired by human cognition.

System 1, as Nvidia calls it, is described as a "fast-thinking action model" that behaves similarly to human reflexes and intuition. It was trained on data collected through human demonstrations and synthetic data generated by Nvidia's Omniverse platform. System 2, which is powered by a vision language model, is a "slow-thinking model" that "reasons about its environment and the instructions it has received to plan actions." Those plans are passed along to System 1, which translates them into "precise, continuous robot movements" that include grasping, moving objects with one or two arms, as well as more complex multistep tasks that involve combinations of basic skills.
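To make the dual-system idea concrete, here is a minimal, purely illustrative Python sketch of the control flow described above: a slow vision-language planner produces a step-by-step plan, and a fast action policy turns each step into continuous commands. All class, method, and field names here are hypothetical and are not Nvidia's GR00T N1 API.

```python
# Illustrative sketch of a "System 2 plans, System 1 acts" control loop.
# Every name below is hypothetical; this is NOT Nvidia's GR00T N1 API.
from dataclasses import dataclass
from typing import List

@dataclass
class Subtask:
    description: str   # e.g. "grasp the mug with the right hand"

class SlowPlanner:
    """Stand-in for the 'slow-thinking' vision-language model."""
    def plan(self, image, instruction: str) -> List[Subtask]:
        # A real model would reason over the image and instruction here.
        return [Subtask("locate the object"),
                Subtask("grasp the object"),
                Subtask("place the object at the goal")]

class FastPolicy:
    """Stand-in for the reflex-like 'fast-thinking' action model."""
    def act(self, image, joint_state, subtask: Subtask) -> List[float]:
        # A real model would emit a short chunk of continuous motor actions.
        return [0.0] * len(joint_state)   # placeholder joint commands

class FakeRobot:
    """Toy robot interface so the sketch runs end to end."""
    def __init__(self, joints: int = 23):
        self.joints = [0.0] * joints
        self.ticks_left = 5

    def camera(self):
        return None                      # pretend camera frame

    def joint_state(self):
        return list(self.joints)

    def apply(self, action):
        self.ticks_left -= 1             # pretend the action moved the robot

    def subtask_done(self, subtask) -> bool:
        if self.ticks_left <= 0:
            self.ticks_left = 5          # reset for the next subtask
            return True
        return False

def control_loop(robot, planner, policy, instruction: str):
    plan = planner.plan(robot.camera(), instruction)          # slow: once per instruction
    for subtask in plan:
        while not robot.subtask_done(subtask):                # fast: every control tick
            action = policy.act(robot.camera(), robot.joint_state(), subtask)
            robot.apply(action)
        print("finished:", subtask.description)

control_loop(FakeRobot(), SlowPlanner(), FastPolicy(), "tidy the table")
```

The point of the split is that the expensive reasoning step runs rarely, while the lightweight policy runs at control rate on fresh observations.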

While the GR00T N1 foundation model is pretrained with generalized humanoid reasoning and skills, developers can customize its behavior and capabilities for specific needs by post-training it with data gathered from human demonstrations or simulations. Nvidia has made GR00T N1 training data and task evaluation scenarios available for download through Hugging Face and GitHub.

Mars

Elon Musk Says SpaceX's First Mission to Mars Will Launch Next Year (bbc.co.uk) 297

"SpaceX founder Elon Musk has said his Starship rocket will head to Mars by the end of next year," writes the BBC, "as the company investigates several recent explosions in flight tests." Human landings could begin as early as 2029 if initial missions go well, though "2031 was more likely", he added in a post on his social media platform X...

The billionaire said in 2020 that he remained confident that his company would land humans on Mars six years later. In 2024, he said he would launch the first Starships to Mars in 2026, with plans to send crewed flights in four years.

Musk has said that the coming Mars mission would carry the Tesla humanoid robot "Optimus", which was shown to the public last year.

Nintendo

Super Nintendo Hardware Is Running Faster As It Ages (404media.co) 42

An anonymous reader quotes a report from 404 Media: Something very strange is happening inside Super Nintendo (SNES) consoles as they age: a component you've probably never heard of is running ever so slightly faster as we get further and further away from the time the consoles first hit the market in the early '90s. The discovery started a mild panic in the speedrunning community in late February since one theoretical consequence of a faster-running console is that it could impact how fast games are running and therefore how long they take to complete. This could potentially wreak havoc on decades of speedrunning leaderboards and make tracking the fastest times in the speedrunning scene much more difficult, but that outcome now seems very unlikely. However, the obscure discovery does highlight the fact that old consoles' performance is not frozen at the time of their release date, and that they are made of sensitive components that can age and degrade, or even 'upgrade', over time. The idea that SNESs are running faster in a way that could impact speedrunning started with a Bluesky post from Alan Cecil, known online as dwangoAC and the administrator of TASBot (short for tool-assisted speedrun robot), a robot that's programmed to play games faster and better than a human ever could.

[...] So what's going on here? The SNES has an audio processing unit (APU) called the SPC700, a coprocessor made by Sony for Nintendo. Documentation given to game developers at the time the SNES was released says that the SPC700 should have a digital signal processing (DSP) rate of 32,000 Hz, which is set by a ceramic resonator running at 24.576 MHz on that coprocessor. We're getting pretty technical here as you can see, but basically the composition of this ceramic component and how it resonates when connected to an electronic circuit generates the frequency for the audio processing unit, or how much data it processes in a second. It's well documented that these types of ceramic resonators are sensitive and can run at higher frequencies when subject to heat and other external conditions. For example, the chart [here], taken from an application manual for Murata ceramic resonators, shows changes in the resonators' oscillation under different physical conditions.

As Cecil told me, as early as 2007 people making SNES emulators noticed that, despite documentation by Nintendo that the SPC700 should run at 32,000 Hz, some SNESs ran faster. Emulators generally now emulate at the slightly higher frequency of 32,040 Hz in order to emulate games more faithfully. Digging through forum posts in the SNES homebrew and emulation communities, Cecil started to put a pattern together: the SPC700 ran faster whenever it was measured further away from the SNES's release. Data Cecil collected since his Bluesky post, which now includes more than 140 responses, also shows that the SPC700 is running faster. There is still a lot of variation, in theory depending on how much an SNES was used, but overall the trend is clear: SNESs are running faster as they age, and the fastest SPC700 ran at 32,182 Hz. More research shared by another user in the TASBot Discord has even more detailed technical analysis which appears to support those findings.
"We don't yet know how much of an impact it will have on a long speedrun," Cecil told 404 Media. "We only know it has at least some impact on how quickly data can be transferred between the CPU and the APU."

Cecil said minor differences in SNES hardware may not affect human speedrunners but could impact TASBot's frame-precise runs, where inputs need to be precise down to the frame, or "deterministic."
Robotics

Google's New Robot AI Can Fold Delicate Origami, Close Zipper Bags (arstechnica.com) 28

An anonymous reader quotes a report from Ars Technica: On Wednesday, Google DeepMind announced two new AI models designed to control robots: Gemini Robotics and Gemini Robotics-ER. The company claims these models will help robots of many shapes and sizes understand and interact with the physical world more effectively and delicately than previous systems, paving the way for applications such as humanoid robot assistants. [...] Google's new models build upon its Gemini 2.0 large language model foundation, adding capabilities specifically for robotic applications. Gemini Robotics includes what Google calls "vision-language-action" (VLA) abilities, allowing it to process visual information, understand language commands, and generate physical movements. By contrast, Gemini Robotics-ER focuses on "embodied reasoning" with enhanced spatial understanding, letting roboticists connect it to their existing robot control systems. For example, with Gemini Robotics, you can ask a robot to "pick up the banana and put it in the basket," and it will use a camera view of the scene to recognize the banana, guiding a robotic arm to perform the action successfully. Or you might say, "fold an origami fox," and it will use its knowledge of origami and how to fold paper carefully to perform the task.
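For readers unfamiliar with the term, a "vision-language-action" model is essentially a function from a camera image plus a natural-language command to a stream of robot actions. The Python fragment below is a hypothetical sketch of that interface only; the function names and action format are invented for illustration and do not correspond to any published Google API.

```python
# Hypothetical shape of a vision-language-action (VLA) policy call.
# Names and payloads are invented for illustration; this is not Google's API.
import numpy as np

def vla_step(image: np.ndarray, instruction: str) -> np.ndarray:
    """Stand-in for one VLA inference step: camera frame + command in,
    a short chunk of end-effector actions out (dx, dy, dz, gripper)."""
    rng = np.random.default_rng(0)
    return rng.normal(scale=0.01, size=(8, 4))   # placeholder action chunk

frame = np.zeros((224, 224, 3), dtype=np.uint8)  # would come from the robot's camera
for action in vla_step(frame, "pick up the banana and put it in the basket"):
    pass  # a real controller would send each (dx, dy, dz, gripper) tuple to the arm
```

By contrast, an "embodied reasoning" model like Gemini Robotics-ER sits a layer above this: it supplies the spatial understanding and planning while leaving the final motion commands to an existing robot control stack.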

In 2023, we covered Google's RT-2, which represented a notable step toward more generalized robotic capabilities by using Internet data to help robots understand language commands and adapt to new scenarios, then doubling performance on unseen tasks compared to its predecessor. Two years later, Gemini Robotics appears to have made another substantial leap forward, not just in understanding what to do but in executing complex physical manipulations that RT-2 explicitly couldn't handle. While RT-2 was limited to repurposing physical movements it had already practiced, Gemini Robotics reportedly demonstrates significantly enhanced dexterity that enables previously impossible tasks like origami folding and packing snacks into Zip-loc bags. This shift from robots that just understand commands to robots that can perform delicate physical tasks suggests DeepMind may have started solving one of robotics' biggest challenges: getting robots to turn their "knowledge" into careful, precise movements in the real world.
DeepMind claims Gemini Robotics "more than doubles performance on a comprehensive generalization benchmark compared to other state-of-the-art vision-language-action models."

Google is advancing this effort through a partnership with Apptronik to develop next-generation humanoid robots powered by Gemini 2.0. Google did not share availability timelines or specific commercial applications for the new AI models.
Businesses

Roomba-maker iRobot Warns of Possible Shutdown Within 12 Months (irobot.com) 77

Roomba maker iRobot has warned it may cease operations within 12 months unless it can refinance debt or find a buyer, just one day after launching a new vacuum cleaner line. In its March 12 quarterly report, the company disclosed it had spent $3.6 million to amend terms on a $200 million Carlyle Group loan from 2023, as U.S. revenue plunged 47% in the fourth quarter.

"Given these uncertainties and the implication they may have on the Company's financials, there is substantial doubt about the Company's ability to continue as a going concern for a period of at least 12 months from the date of the issuance of its consolidated 2024 financial statements," the company wrote.

The robot vacuum pioneer has initiated a formal strategic review after a failed Amazon acquisition, the departure of founder Colin Angle, and layoffs affecting over half its workforce. iRobot cited mounting competition from Chinese manufacturers and expects continued losses for "the foreseeable future."
AI

Will an 'AI Makeover' Help McDonald's? (msn.com) 100

"McDonald's is giving its 43,000 restaurants a technology makeover," reports the Wall Street Journal, including AI-enabled drive-throughs and AI-powered tools for managers — as well as internet-connected kitchen equipment.

"Technology solutions will alleviate the stress...." says McDonald's CIO Brian Rice. McDonald's tapped Google Cloud in late 2023 to bring more computing power to each of its restaurants — giving them the ability to process and analyze data on-site... a faster, cheaper option than sending data to the cloud, especially in more far-flung locations with less reliable cloud connections, said Rice... Edge computing will enable applications like predicting when kitchen equipment — such as fryers and its notorious McFlurry ice cream machines — is likely to break down, Rice said. The burger chain said its suppliers have begun installing sensors on kitchen equipment that will feed data to the edge computing system and give franchisees a "real-time" view into how their restaurants are operating. AI can then analyze that data for early signs of a maintenance problem.

McDonald's is also exploring the use of computer vision, the form of AI behind facial recognition, in store-mounted cameras to determine whether orders are accurate before they're handed to customers, he said. "If we can proactively address those issues before they occur, that's going to mean smoother operations in the future," Rice added...

Additionally, the ability to tap edge computing will power voice AI at the drive-through, a capability McDonald's is also working with Google's cloud-computing arm to explore, Rice said. The company has been experimenting with voice-activated drive-throughs and robotic deep fryers since 2019, and ended its partnership with International Business Machines to test automated order-taking at the drive-through in 2024.

Edge computing will also help McDonald's restaurant managers oversee their in-store operations. The burger giant is looking to create a "generative AI virtual manager," Rice said, which handles administrative tasks such as shift scheduling on managers' behalf. Fast-food giant Yum Brands' Pizza Hut and Taco Bell have explored similar capabilities.

Robotics

World's First Front-Flippin' Humanoid Robot (newatlas.com) 23

Chinese robotics company Zhongqing Robotics (also known as Engine AI) posted a video of what is claimed to be the world's first humanoid robot front flip. New Atlas reports: Ten years ago, this kind of stuff simply did not exist. And now you can buy your very own open-source PM01 robot for US$13,700, according to EngineAI's website. Its specs aren't bad: 5-DoF (degrees of freedom) in each arm and six per leg. That's 23-DoF in body movement in total. The bot also boasts 221 lb-ft of torque (300 Nm), which seems like quite a punch when the little guy is only 4.5 ft (138 cm) tall and weighs 88 lb (40kg). You can watch the video here.
