I am a research student passionate about location-based AR games, and I am currently building some proofs of concept to try them out.
I want to place some AR objects (animals, toys, etc.) at a specific location and allow other AR app users to see and interact with what I placed earlier.
I believe this feature is still in beta and is called the Visual Positioning System (VPS). Is there a way to experiment with this feature, or any alternative way? Kindly guide me on this.
Thank you for your enthusiasm for the VPS feature! While invitations to the VPS beta are now closed in preparation for the general launch, please tune in to the Lightship Summit keynote to learn how you can get access!
Link here: https://event.lightshipsummit.com/
I am also looking for an AR solution and am interested in Lightship. One question: can VPS allocate a VPS node and place a 3D model without a mesh scan if we already know the latitude and longitude?
Thanks,
Andy
The geo-anchoring/geo-tagging feature is crucial in multiplayer location-based AR games because it allows digital objects to remain fixed at real-world positions, giving players a shared and immersive experience. By tying virtual elements to specific coordinates, users can interact with the same AR content from different devices and perspectives. This is especially important for interactive quests, treasure hunts, or collaborative tasks. If you’re interested in a deeper look at how these mechanics are implemented, this article on location-based AR explains the development process and practical use cases really well. Ultimately, geo-tagging makes AR worlds feel persistent and meaningful, rather than just random overlays.
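To make the idea concrete, here is a minimal sketch (in Python, backend-agnostic; all names are hypothetical and not from any specific SDK) of the bookkeeping behind geo-tagging: each placed object is stored with its coordinates, and any player whose position falls within a radius of those coordinates is handed the same object to render. A system like VPS would refine the final on-device placement visually, but the shared, persistent part reduces to this kind of lookup.

```python
import math
from dataclasses import dataclass

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres

@dataclass
class GeoTaggedObject:
    """A virtual object pinned to a real-world coordinate (hypothetical schema)."""
    object_id: str
    model_name: str    # e.g. "fox", "toy_robot"
    latitude: float    # WGS84 degrees
    longitude: float   # WGS84 degrees

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two lat/lon points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def objects_near_player(placed, player_lat, player_lon, radius_m=50.0):
    """Return the shared objects a player should see at their current position."""
    return [
        obj for obj in placed
        if haversine_m(obj.latitude, obj.longitude, player_lat, player_lon) <= radius_m
    ]

# Example: one player places a fox; another player standing nearby queries for it.
store = [GeoTaggedObject("obj-1", "fox", 40.74845, -73.98565)]
print(objects_near_player(store, 40.74850, -73.98570))
```

In practice the list of placed objects would live in a shared backend rather than in memory, but the proximity query and the shared coordinate frame are what make the experience feel persistent across players.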
Thank you, Niantic, for all the amazing tools and framework features. After three years of using this amazing platform, I carried out my PhD research and successfully completed it.
My thesis explores how LBARGs can connect remote players and locations by designing and evaluating AR game experiences. The research begins with the development of an LBARG that visualizes remote spaces using different AR perspectives, offering insights into how these modes affect spatial presence and immersion across distance. Building on these findings, we developed an AR hide-and-seek game that compares remote AR gameplay with co-located play, examining how remote interactions influence player engagement, spatial decision-making, and overall gameplay experience. Finally, based on the findings of that study, we designed an improved version of the multiplayer hide-and-seek game, introducing five interactive core AR game mechanics. This study helps to understand how such AR game mechanics foster game engagement and influence players’ spatial decision-making strategies.
If anyone is interested, here is the link to my work: Designing multiplayer location-based augmented reality games that connect remote players and places.