Do We Want Robot “Rangers” in Our Parks? Only If They’re Designed for People

AI robots are already showing up in city parks—patrolling at night, helping during pandemics, and guiding visitors. Cool idea, right? Maybe. A new study looked at how the public actually feels about these robots, and the findings are a wake-up call: people are worried.

Drawing on 36,520 YouTube comments about park robots, the researchers ran large-scale sentiment analysis and topic modeling. The verdict: negative reactions were common, and not just mild annoyance. Think fear of control, fear of harm, and imagery pulled straight from sci-fi.
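
For readers curious what that kind of analysis involves, here is a minimal sketch of a comment-analysis pipeline. It is an illustration, not the study's actual code: the CSV file name and "text" column are placeholders, and the off-the-shelf VADER sentiment scorer plus scikit-learn's LDA topic model stand in for whatever tools the authors used.

```python
# Minimal sketch of a comment-analysis pipeline (illustration only, not the study's code).
# Assumes a CSV of scraped YouTube comments with a "text" column; the file name,
# column name, and model choices (VADER + LDA) are placeholders for this example.
import pandas as pd
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

comments = pd.read_csv("comments.csv")["text"].dropna().astype(str)

# 1) Sentiment: score each comment from -1 (most negative) to +1 (most positive).
analyzer = SentimentIntensityAnalyzer()
scores = comments.apply(lambda c: analyzer.polarity_scores(c)["compound"])
print(f"Share of negative comments: {(scores < -0.05).mean():.1%}")

# 2) Topics: surface recurring themes (e.g., surveillance, weapons, wildlife).
vectorizer = CountVectorizer(stop_words="english", max_features=5000)
doc_term = vectorizer.fit_transform(comments)
lda = LatentDirichletAllocation(n_components=10, random_state=0).fit(doc_term)

terms = vectorizer.get_feature_names_out()
for i, topic in enumerate(lda.components_):
    top_terms = [terms[j] for j in topic.argsort()[-8:][::-1]]
    print(f"Topic {i}: {', '.join(top_terms)}")
```

The point of the sketch is the shape of the method: score each comment for tone, then surface the recurring themes that dominate the conversation.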

That doesn’t mean park robots are doomed. It means we need human-centered design—from the shape of the robot to the rules that govern it—so the tech supports the relaxing, restorative experience parks are meant to provide.


What the public is saying

·  “Will this thing police me?”
Many commenters feared robots that monitor or control people in public spaces.

·  “Is this a weapon in waiting?”
Mentions of weaponization were common. Popular culture primes these reactions—think Terminator, Black Mirror, Star Wars.

·  “What about animals?”
People worried robots could stress wildlife—from zoo cheetahs to sheep in fields.

·  “Patrol bots feel the scariest.”
Robots built for security drew more negative reactions than those doing benign tasks.


Why this matters for parks

Parks are supposed to make us feel safe, calm, and connected. If robots trigger anxiety, we risk trading away the mental health and social benefits that parks deliver—especially for people already sensitive to surveillance or tech (like elders and kids).

At the same time, robots could help: 24/7 patrols can deter crime, reduce staff risk, and support public health during crises. The point isn’t to reject robots; it’s to build them around people.


Human-Centered Design: A simple checklist for park robots

1) Explain the “why,” “what,” and “where.”

·  Clear signage and demos: what the robot does, what it doesn’t do, and where it operates.

·  Marked robot zones so visitors can opt in—or avoid them.

2) Make control visible and simple.

·  Big, obvious help/stop buttons and QR codes for feedback.

·  Staff on-site who can answer questions and de-escalate.

3) Choose a friendly form factor.

·  Avoid silhouettes that echo military or horror tropes (yes, sci-fi aesthetics matter).

·  Use calm movement patterns; no sudden sprints or looming behavior.

4) Build trust by design, not by promise.

·  Privacy-first: no facial recognition by default; strict data minimization; short retention windows; third-party audits. (A sketch of what such a policy could look like follows this checklist.)

·  Transparency: publish capabilities and limitations; display data indicators (recording/not recording).

·  Fairness: no profiling; independent review of algorithms.

5) Protect wildlife.

·  Low noise, low speed near habitat; animal-aware sensors; no pursuit behaviors.

·  Vet routes and schedules with ecologists; monitor impacts and adapt.

6) Start small, with the community.

·  Pilot in limited areas; collect feedback from women, elders, teens, and minority groups who may experience parks differently.

·  Treat negative feedback as a design requirement, not a PR problem.
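
To make point 4 less abstract, here is a minimal, hypothetical sketch of what a published privacy configuration for a park robot could look like. Every field name and default below is an assumption for illustration; it is not a real vendor API, a legal standard, or anything proposed in the study.

```python
# Hypothetical privacy-policy object for a park patrol robot (illustration only).
# Field names and values are assumptions, not a real vendor API or the study's proposal.
from dataclasses import dataclass


@dataclass(frozen=True)
class PrivacyPolicy:
    facial_recognition_enabled: bool = False   # off by default (privacy-first)
    audio_recording: bool = False
    video_retention_days: int = 7              # short retention window
    data_fields_collected: tuple = ("timestamp", "location", "obstacle_events")
    recording_indicator_visible: bool = True   # visitors can see when video is captured
    third_party_audit_interval_months: int = 6
    profiling_allowed: bool = False            # fairness: no profiling

    def published_summary(self) -> str:
        """Plain-language summary a park could post on signage or a public webpage."""
        return (
            f"Facial recognition: {'on' if self.facial_recognition_enabled else 'off'}. "
            f"Video kept for {self.video_retention_days} days. "
            f"Data collected: {', '.join(self.data_fields_collected)}. "
            f"Independent audit every {self.third_party_audit_interval_months} months."
        )


print(PrivacyPolicy().published_summary())
```

Publishing a plain-language summary like this next to the robot's route map is one way to turn "transparency" from a promise into something visitors and auditors can actually check.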


The opportunity (if we get it right)

·  Safer parks without adding risk to human staff.

·  Accessible assistance after hours and during events.

·  Better information for maintenance and conservation—done ethically.

·  Public trust in civic AI, built in the open and earned over time.


A note to city leaders, designers, and vendors

If a robot makes a park feel less welcoming, it fails—no matter how advanced it is. The study’s big lesson is clear: design for people first. That means co-creating with park users, being honest about trade-offs, and putting privacy, safety, and wildlife care front and center.

Robots can be good neighbors in green spaces. But they have to act like good neighbors, too.


Original article: Jaung, W. (2024). The need for human-centered design for AI robots in urban parks and forests. Urban Forestry & Urban Greening, 91, 128186. https://doi.org/10.1016/j.ufug.2023.128186