AI Chatbots Are Putting Clueless Hikers in Danger, Search and Rescue Groups Warn

Artificial intelligence chatbots are becoming a dangerous source of misinformation for outdoor activities, with search and rescue teams increasingly responding to emergencies caused by ill-prepared hikers following AI advice. The case of two hikers needing rescue near Vancouver after following ChatGPT's guidance highlights the limitations of using AI for wilderness planning, especially its inability to provide real-time information about seasonal conditions and trail difficulty that could be lifesaving in remote environments.
The big picture: Search and rescue teams are warning against relying on AI chatbots and navigation apps for hiking preparation after rescuing hikers who followed incorrect AI advice and found themselves in dangerous conditions.
- Two hikers attempting to climb Unnecessary Mountain near Vancouver had to be rescued after following Google Maps and ChatGPT’s advice, which failed to warn them about spring snow conditions at higher elevations.
- The hikers were wearing only flat-soled sneakers when they encountered snow, requiring rescuers to bring them proper boots and ski poles.
Why this matters: AI chatbots lack critical real-time information about weather conditions, seasonal hazards, and appropriate gear requirements that can mean the difference between a safe hike and a dangerous emergency.
- Mountain Rescue England and Wales has reported a historic surge in rescue operations, which it attributes to social media influence and unreliable navigation apps.
- The incident highlights a broader concern about AI tools providing outdated or generalized information for activities where specific, current local knowledge is essential for safety.
Expert assessment: Outdoor specialists who’ve tested ChatGPT find its hiking advice to be inconsistent and potentially dangerous.
- Stephen Hui, author of “105 Hikes,” discovered that while ChatGPT can provide “decent directions” for popular trails, it struggles with obscure ones and fails to account for seasonal variations.
- Brent Calkin, leader of the Lions Bay Search and Rescue team, found that the quality of ChatGPT's answers depended heavily on asking precisely the right questions, knowledge that novice hikers typically lack.
Between the lines: The limitations of AI in outdoor planning reveal a fundamental disconnect between technological capabilities and wilderness realities.
- ChatGPT and similar tools cannot account for changing conditions like “a storm coming in this week” that local human experts would readily share.
- Time-sensitive information is particularly crucial in regions like British Columbia, where the mountain views hikers seek are safely reachable without special equipment only from July to October.
The bottom line: Experts recommend consulting human sources with local knowledge through forums like Reddit or Facebook groups instead of relying on AI chatbots for wilderness expedition planning.