Travelers today are increasingly turning to tools such as ChatGPT for travel ideas, but some end up heading for destinations that do not exist on the map…
Miguel Angel Gonzora Meza, the owner of a travel agency, was in a rural town in Peru preparing for a hike in the Andes when he overheard a strange conversation: two tourists were discussing their plans to hike on their own through the mountains to the “Humantay Holy Gorge”.
The problem is that no such place exists in the country, and the tourists had paid to reach a non-existent destination.
But this seemingly innocent mistake could even have proved fatal: as he explains, the misinformation could have led them onto hazardous trails at high altitude, without oxygen, a guide or mobile signal.
According to the BBC, in just a few years artificial intelligence tools such as ChatGPT, Microsoft Copilot and Google Gemini have become an integral part of travel planning for millions of people. One survey found that 30% of international travelers now use AI tools and dedicated AI travel sites, such as Wonderplan and Layla, to organize their trips.
Although these programs can offer valuable travel tips when they work properly, they can also land people in frustrating or even dangerous situations when they do not.
It is a lesson some travelers learn only when they reach their destination and discover that they were given incorrect information, or that they were sent to a place conjured up by a machine’s imagination.
Dana Yao and her husband experienced this firsthand. They used ChatGPT to plan a romantic hike to the top of Mount Misen on the Japanese island of Itsukushima. After exploring the town of Miyajima without any problems, they set off at 15:00 to climb to the summit in time for the sunset, just as ChatGPT had suggested.
But when they tried to come back down, the cable car had already closed, contrary to what ChatGPT had told them, leaving them stranded at the top of the mountain with no way to get back.
In another incident, in 2024, the AI travel assistant Layla told users that there was an Eiffel Tower in Beijing, while it suggested to a British traveler a marathon route in northern Italy that did not exist.
According to a 2024 survey, 37% of respondents who had used artificial intelligence to plan their trips reported that it could not provide them with enough information, while about 33% said that the suggestions it gave them contained false information.
These problems stem from the way artificial intelligence generates its answers.
As Ragid Gani, a distinguished professor of machine learning at Carnegie Mellon University, explains, programs such as ChatGPT may seem to give reasonable and useful advice, but the way they arrive at that information means we can never be entirely sure it is correct.
“It doesn’t know the difference between travel advice, directions or recipes,” the expert says. “It just knows words, so it keeps producing words that make what it tells you sound realistic,” he notes.
*The main image is AI-generated and shows the Eiffel Tower in Beijing
Source: Skai