AR Navigation
Part 1: Setting up AR
This guide walks through the process of setting up an AR View, configuring and sending data to the Niantic SDK, and placing content on a map using SwiftUI.
This sample follows the MVVM pattern: VPS2ARViewController handles display, VPS2ViewModel handles the data, and an ARManager wraps Niantic SDK AR session calls.
IMPORTANT: NSDK requires you to send it the raw frame data for processing, which means we have to use an AR view. Because that AR rendering lives in UIKit, we draw the AR content in UIKit and bridge it into SwiftUI.
Prerequisites
- Completed the previous how-tos — you have a working NSDKSession and AuthRetryHelper ready to use.
- iOS 17+, SwiftUI, MapKit.
- The NSDK framework is linked to your target.
Files Overview
| File | Role |
|---|---|
| VPS2ARViewController.swift | UIKit AR view and overlay UI |
| VPS2ARViewControllerRepresentable | SwiftUI wrapper for VPS2ARViewController |
| VPS2ViewModel.swift | Data model |
| ARManager.swift | Listener class for NSDKSession and NSDKView |
| VPS2MapView.swift | Map card overlay view on top of VPS2ARViewController |
1. Setting up the AR View
To display AR, we will need to create a UIViewController-derived class that will serve as the base for rendering the AR view, tracking VPS2 status, and displaying additional content. This will be paired with a VPS2ViewModel to listen for the VPS2Session's Localization events and track the state of the VPS2 anchors that AR content relies on.
Create the View Controller
The VPS2ARViewController is the main view handling the display of AR content, and UI elements overlaid on top. Create the base view controller class, including the initialization and setup functions expected for a UIViewController subclass:
Source: VPS2ARViewController.swift
final class VPS2ARViewController: UIViewController, CLLocationManagerDelegate {
    private let siteName: String
    private let anchorPayload: String
    private let siteCoordinate: CLLocationCoordinate2D
    private let arManager: ARManager
    private var vps2Session: NSDKVps2Session!
    private var viewModel: VPS2ViewModel!

    init(arManager: ARManager, siteName: String, anchorPayload: String, siteCoordinate: CLLocationCoordinate2D) {
        self.arManager = arManager
        self.siteName = siteName
        self.anchorPayload = anchorPayload
        self.siteCoordinate = siteCoordinate
        super.init(nibName: nil, bundle: nil)
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        arManager.nsdkView.setup(in: view)
        viewModel = VPS2ViewModel(session: vps2Session, frameState: arManager.frameState)
    }
}
Create the View Model
The AR view needs to do two things once the user has selected a site:
- Place AR content in 3D space — render a destination marker and navigation arrows anchored to the VPS2 site in the real world.
- Show a 2D map overlay — display the user's current position and the site's fixed position on a miniature map.
We use VPS2ViewModel as the bridge between NSDKVps2Session's Combine publishers and the UI. It subscribes to anchor and localization updates from the session and exposes the results as @Published properties that the view controller and map overlay can observe.
Create the VPS2ViewModel class with the state it needs to track:
Source: VPS2ViewModel.swift
/// Binds `NSDKVps2Session` Combine publishers and `ARFrameState` into UI-ready state.
final class VPS2ViewModel {
    private(set) var poiAnchorId: NSDKVpsAnchorId?
    @Published private(set) var showMap: Bool = false
    @Published private(set) var userMapCoordinate: CLLocationCoordinate2D?
    @Published private(set) var userMapHeading: CGFloat?
    @Published private(set) var anchorWorldTransform: simd_float4x4?
    @Published private(set) var anchorTrackingState: VpsAnchorUpdate.AnchorTrackingState?
    private let session: NSDKVps2Session
    private var cancellables = Set<AnyCancellable>()
}
The published properties cover the AR-space anchor transform and tracking state (for 3D rendering), and the user's map coordinate and heading (for the 2D overlay). showMap controls whether the map is visible at all, and is driven by the anchor's tracking state.
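The guide does not show the anchor-side subscription that feeds anchorWorldTransform, anchorTrackingState, and showMap. The following is a minimal sketch of how the view model's init() might wire it up. The `$latestAnchorUpdate` publisher and the fields on `VpsAnchorUpdate` (anchorId, worldTransform, trackingState) are assumed names, not confirmed NSDK API, and a `cancellables` Set<AnyCancellable> is assumed to exist on the view model:

```swift
// Hypothetical sketch — `$latestAnchorUpdate` and the VpsAnchorUpdate fields
// are assumed names, not confirmed NSDK API.
init(session: NSDKVps2Session, frameState: ARFrameState) {
    self.session = session
    session.$latestAnchorUpdate
        .receive(on: DispatchQueue.main)
        .sink { [weak self] update in
            guard let self, let update, update.anchorId == self.poiAnchorId else { return }
            self.anchorWorldTransform = update.worldTransform
            self.anchorTrackingState = update.trackingState
            // Only show the map once the anchor is actually being tracked.
            self.showMap = (update.trackingState == .tracking)
        }
        .store(in: &cancellables)
    // (The frameState subscription is added in the "Getting Device Geolocation" step.)
}
```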
2. Communicating with NSDK (ARManager, DataSource)
We set up the configuration and message handling for the NSDKSession in a dedicated ARManager class. ARManager serves as the view delegate for the ARSession's updates, receiving each updated AR frame and passing it along to the NSDKSession for use by other classes such as VPS2ARViewController. One thing to note here is that we capture the current device orientation: whether a user is holding their device upright (as when using the camera) or flat (as when viewing a map) determines how an accurate heading is computed.
The following shows how the ARManager configures the data for the NSDKSession, and creates the function calls to start and stop AR processing:
Source: ARManager.swift
class ARManager: NSObject, UIOrientationReporter {
    let nsdkSession: NSDKSession
    let nsdkView: NSDKView
    let frameState = ARFrameState()

    var currentOrientation: NSDKScreenOrientation {
        let orientation = nsdkView.window?.windowScene?.interfaceOrientation ?? .unknown
        return NSDKScreenOrientation(orientation)
    }

    private var dataSource: NSDKSessionDataSource?

    init(nsdkSession: NSDKSession) {
        self.nsdkSession = nsdkSession
        if let dataset = BundlePlaybackDatasetLoader(directory: "").loadDataset() {
            nsdkView = NSDKView(dataset: dataset)
        } else {
            nsdkView = NSDKView()
        }
        super.init()
        setupDataSource()
        nsdkView.delegate = self
    }
}
Setting up the Data Source
To use NSDK, we must provide it an AR data source. NSDKSession relies on receiving regular updates from the ARSession to provide VPS2 functionality, and the data source differs depending on whether we are using an NSDK PlaybackSession or capturing live AR data. The following function in ARManager sets up the appropriate DataSource class for the session type and assigns it to the NSDKSession:
Source: ARManager.swift
private func setupDataSource() {
    if let playbackSession = nsdkView.session as? PlaybackSession {
        dataSource = PlaybackSessionDataSource(session: playbackSession)
    } else {
        dataSource = DefaultSessionDataSource(
            session: nsdkView.session,
            orientationReporter: self
        )
    }
    nsdkSession.dataSource = dataSource
}
Starting and Stopping the NSDKSession
Add the following functions to ARManager to allow our VPS2ARViewController to start and stop the NSDKSession:
Source: ARManager.swift
func startSession() {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal, .vertical]
    if ARUtils.isLidarAvailable() {
        config.frameSemantics.insert(.sceneDepth)
    }
    nsdkView.session.run(config)
    setupLighting()
    UIApplication.shared.isIdleTimerDisabled = true
}

func stopSession() {
    nsdkView.session.pause()
    UIApplication.shared.isIdleTimerDisabled = false
}

private func setupLighting() {
    nsdkView.environment.lighting.intensityExponent = 1.0
    let light = DirectionalLight()
    light.light.intensity = 1000
    light.orientation = simd_quatf(
        angle: .pi / 4,
        axis: simd_normalize([1, 1, 0])
    )
    let lightAnchor = AnchorEntity(world: .zero)
    lightAnchor.addChild(light)
    nsdkView.scene.addAnchor(lightAnchor)
}

private func handleFrameUpdate() {
    nsdkSession.update()
    frameState.update(camera: nsdkView.getCamera())
}
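The sample does not show where startSession() and stopSession() are called from. A common pattern, sketched here as an assumption about this sample's structure rather than its confirmed code, is to drive them from the AR view controller's lifecycle:

```swift
// In VPS2ARViewController — start AR processing when the screen appears
// and release the camera (and re-enable the idle timer) when it disappears.
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    arManager.startSession()
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    arManager.stopSession()
}
```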
Handling AR View Delegate Callbacks
ARManager also serves as the delegate for the NSDKView, and handles the callbacks for the AR frame updates. We need to notify NSDK when a new AR frame is made available. Add the following to ARManager:
Source: ARManager.swift
extension ARManager: NSDKViewDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        handleFrameUpdate()
    }

    func playbackSession(_ session: PlaybackSession, didUpdate frame: PlaybackFrame) {
        handleFrameUpdate()
    }

    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        frameState.update(trackingState: camera.trackingState)
    }

    func playbackSession(
        _ session: PlaybackSession,
        cameraDidChangeTrackingState camera: PlaybackCamera
    ) {
        frameState.update(trackingState: camera.trackingState)
    }
}
Starting VPS2 Localization
To localize to a site, we need to call trackAnchor() on the VPS2Session. When we chose a VPS2 location in SiteMapView and launched the AR view, we obtained the site name, latitude and longitude coordinates, and VPS2 anchor payload. We will use the anchor payload data to begin the tracking process and start localizing.
In VPS2ARViewController add a function to start localization and begin tracking of the chosen VPS2 anchor:
Source: VPS2ARViewController.swift
private func startLocalization() {
    updateInfoLabel(text: "Localizing...")
    Task { @MainActor in
        do {
            // `retryHelper` is the AuthRetryHelper set up in the previous how-to.
            try await retryHelper.withRetry { [weak self] in
                guard let self else { return }
                let id = try self.vps2Session.trackAnchor(payload: self.anchorPayload)
                self.viewModel.setPoiAnchorId(id)
            }
        } catch {
            print("Failed to track anchor. Error: \(String(describing: error))")
        }
    }
}
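setPoiAnchorId(_:) and updateInfoLabel(text:) are called above but never defined in this guide. Minimal sketches follow; the status UILabel is an assumed overlay element, not part of the sample as shown:

```swift
// In VPS2ViewModel — record the tracked anchor's id so incoming anchor
// updates can be matched against the POI we care about.
func setPoiAnchorId(_ id: NSDKVpsAnchorId) {
    poiAnchorId = id
}

// In VPS2ARViewController — hypothetical status label on the AR overlay.
private func updateInfoLabel(text: String) {
    infoLabel.text = text  // `infoLabel` is an assumed UILabel, not shown in the sample
}
```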
Getting Device Geolocation
As the NSDKVps2Session updates, we can track those changes to update map data accordingly. Add the following to the VPS2ViewModel's init():
Source: VPS2ViewModel.swift
session.$latestLocalization
    .combineLatest(frameState.$camera)
    .receive(on: DispatchQueue.main)
    .sink { [weak self] localization, camera in
        self?.updateMapIndicators(localization: localization, camera: camera)
    }
    .store(in: &cancellables)
And add the associated function that converts the VPS2 localization data into map coordinates and heading:
Source: VPS2ViewModel.swift
// MARK: - Map

private func updateMapIndicators(
    localization: Vps2Localization?,
    camera: NSDKCamera?
) {
    guard
        let localization,
        localization.trackingState != .unavailable,
        let camera
    else {
        userMapCoordinate = nil
        userMapHeading = nil
        return
    }

    // When the device is held flat, heading comes from the device's top edge;
    // otherwise it comes from the camera's facing direction.
    let orientation = UIDevice.current.orientation
    let heading: HeadingMode = (orientation == .faceUp || orientation == .faceDown)
        ? .deviceTop
        : .cameraDirection
    guard let userGeo = session.deviceGeolocation(headingMode: heading) else { return }

    userMapCoordinate = CLLocationCoordinate2D(
        latitude: userGeo.geolocationData.latitude,
        longitude: userGeo.geolocationData.longitude
    )
    userMapHeading = CGFloat(userGeo.geolocationData.heading)
}
Storing Site Geolocation
Unlike the device's position, the site's coordinates never change — they are already known from the site selection screen. We configure this in the init() function of VPS2ARViewController:
Source: VPS2ARViewController.swift
mapData.poiCoordinate = siteCoordinate
3. Connecting the AR View to SwiftUI
SwiftUI provides several mechanisms for UIKit-derived classes to interact with its navigation stack and vice versa. Because VPS2ARViewController was built in UIKit in order to use UIKit's AR functionality, we need a wrapper so it can be added to the outer SwiftUI view stack. This is done through SwiftUI's UIViewControllerRepresentable protocol.
Create VPS2ARViewControllerRepresentable:
Source: VPS2ARViewControllerRepresentable.swift
/// Wraps `VPS2ARViewController` for use in SwiftUI navigation.
struct VPS2ARViewControllerRepresentable: UIViewControllerRepresentable {
    let arManager: ARManager
    let siteName: String
    let anchorPayload: String
    let siteCoordinate: CLLocationCoordinate2D

    func makeUIViewController(context: Context) -> VPS2ARViewController {
        VPS2ARViewController(
            arManager: arManager,
            siteName: siteName,
            anchorPayload: anchorPayload,
            siteCoordinate: siteCoordinate
        )
    }

    func updateUIViewController(_ uiViewController: VPS2ARViewController, context: Context) {}
}
Adding VPS2ARViewController to the SwiftUI View:
SwiftUI's UIViewControllerRepresentable allows you to create a UIKit UIViewController-derived object and manage it within the SwiftUI interface. The following function creates the wrapped VPS2ARViewController and adds it to the SwiftUI's navigation stack:
Source: LandingView.swift
@ViewBuilder
private func destinationARView(siteName: String, payload: String, coordinate: CLLocationCoordinate2D, nsdkSession: NSDKSession) -> some View {
    let arManager = ARManager(nsdkSession: nsdkSession)
    VPS2ARViewControllerRepresentable(
        arManager: arManager,
        siteName: siteName,
        anchorPayload: payload,
        siteCoordinate: coordinate
    )
    .navigationTitle(siteName)
    .navigationBarTitleDisplayMode(.inline)
    .ignoresSafeArea()
    .navigationViewStyle(.stack)
}
This can then be added as part of the destination selection in the main View's NavigationStack:
Source: LandingView.swift
// Navigation Destinations
.navigationDestination(for: NavigationStep.self) { step in
    switch step {
    case .map(let viewModel, let nsdkSession):
        self.destinationSiteMapView(viewModel: viewModel, nsdkSession: nsdkSession)
    case .ar(let siteName, let payload, let coordinate, let nsdkSession):
        self.destinationARView(siteName: siteName, payload: payload, coordinate: coordinate, nsdkSession: nsdkSession)
    }
}
4. Setting up the Map view on top of the AR View
Combining the 3D AR view with a 2D map display is another case that requires UIKit and SwiftUI to communicate with one another. Here we create a VPS2MapView in SwiftUI that is rendered on top of the UIKit view created for VPS2ARViewController. This example leverages the existing functionality of MapKit and CoreLocation to create and display the map, as well as to place content onto it.
Update VPS2ARViewController to handle changes to the Map Data
The VPS2ARViewController will respond to changes in the data coming from VPS2ViewModel and pass those changes along to a separate view for displaying the map. Before we can do that, we need to set up some initial handling of map-specific data. Add the following functions and variables where indicated:
Source: VPS2ARViewController.swift
// Add new variables to the class:
private let mapData = VPS2MapData()
private var cancellables = Set<AnyCancellable>()

// In viewDidLoad(), add the initial calls to set up the map data and pass in the current tracked coordinates:
setupMapOverlay()
mapData.poiCoordinate = siteCoordinate
bindMapFromViewModel()

// In viewWillAppear(), flag the map as visible initially:
mapData.isVisible = true

// In viewWillDisappear(), flag the map as not visible:
mapData.isVisible = false

// Also add these functions for handling map data:
private func bindMapFromViewModel() {
    Publishers.CombineLatest(viewModel.$userMapCoordinate, viewModel.$userMapHeading)
        .receive(on: DispatchQueue.main)
        .sink { [weak self] coord, heading in
            self?.applyUserMapLocation(vpsCoordinate: coord, vpsHeading: heading)
        }
        .store(in: &cancellables)
}

private func applyUserMapLocation(
    vpsCoordinate: CLLocationCoordinate2D?,
    vpsHeading: CGFloat?
) {
    if let coord = vpsCoordinate {
        mapData.userCoordinate = coord
        mapData.userHeading = vpsHeading.map { Double($0) }
    }
}
Creating VPS2MapView
We will display the actual Map content using a separate view, VPS2MapView. The following creates an overlay map view using MapKit that can be used by VPS2ARViewController to show the user's location as well as the location of the chosen VPS2 location:
Source: VPS2MapView.swift
/// Observable data bridging VPS2ViewModel's Combine publishers to the SwiftUI map overlay.
@Observable
final class VPS2MapData {
    var userCoordinate: CLLocationCoordinate2D?
    var userHeading: Double?
    var poiCoordinate: CLLocationCoordinate2D?
    var isVisible: Bool = false
}

/// Miniature map overlay for the VPS2 AR screen showing user and POI locations.
///
/// - Minimized: small card in the bottom-right corner. Tap to expand.
/// - Expanded: near-full-screen with close button and full map interaction.
///   Tap outside or the close button to collapse.
struct VPS2MapView: View {
    var data: VPS2MapData
    @State private var isExpanded = false
    @State private var showCloseButton = false
    @State private var cameraPosition: MapCameraPosition = .automatic

    var body: some View {
        GeometryReader { geometry in
            ZStack(alignment: .bottomTrailing) {
                if isExpanded {
                    // Nearly transparent backdrop that catches taps to collapse the map.
                    Color.black.opacity(0.001)
                        .frame(maxWidth: .infinity, maxHeight: .infinity)
                        .onTapGesture { collapse() }
                }
                if data.isVisible {
                    mapCard(in: geometry)
                }
            }
            .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .bottomTrailing)
        }
    }
}
Creating the MapCard
The actual display of the Map itself will be rendered as a small card in the corner of the screen, with the option to expand it by tapping on it. Add the following functions to the VPS2MapView struct to create the Map display itself, and hook up the expand/collapse functionality:
Source: VPS2MapView.swift
// MARK: - Map Card

private func mapCard(in geometry: GeometryProxy) -> some View {
    Map(position: $cameraPosition, interactionModes: isExpanded ? .all : [])
        .mapStyle(.standard)
        .clipShape(RoundedRectangle(cornerRadius: 20))
        .overlay(alignment: .topTrailing) {
            if showCloseButton {
                closeButton
            }
        }
        .contentShape(RoundedRectangle(cornerRadius: 20))
        .onTapGesture {
            if !isExpanded { expand() }
        }
        .frame(
            width: isExpanded ? geometry.size.width - 40 : geometry.size.width / 3,
            height: isExpanded ? geometry.size.height - 40 : geometry.size.height * 0.25
        )
        .padding(.trailing, 20)
        .padding(.bottom, isExpanded ? 20 : 70)
        .shadow(color: .black.opacity(0.3), radius: isExpanded ? 10 : 4)
}

private var closeButton: some View {
    Button { collapse() } label: {
        Image(systemName: "xmark")
            .font(.system(size: 14, weight: .bold))
            .foregroundStyle(.white)
            .frame(width: 36, height: 36)
            .background(.black.opacity(0.5))
            .clipShape(Circle())
    }
    .padding(12)
}

// MARK: - State Transitions

private func expand() {
    withAnimation(.easeInOut(duration: 0.35)) {
        isExpanded = true
    } completion: {
        showCloseButton = true
    }
}

private func collapse() {
    showCloseButton = false
    withAnimation(.easeInOut(duration: 0.35)) {
        cameraPosition = .automatic
        isExpanded = false
    }
}
Instantiating the VPS2MapView
UIHostingController is the counterpart of what we did previously with UIViewControllerRepresentable: it allows SwiftUI views to be incorporated into a UIKit view hierarchy. In this case, we are creating a map overlay to place on top of the AR view. Add the following function to VPS2ARViewController to create the VPS2MapView and add it to the view hierarchy as a subview:
Source: VPS2ARViewController.swift
private func setupMapOverlay() {
    let hostingController = UIHostingController(rootView: VPS2MapView(data: mapData))
    hostingController.view.backgroundColor = .clear
    hostingController.view.translatesAutoresizingMaskIntoConstraints = false
    addChild(hostingController)
    view.addSubview(hostingController.view)
    NSLayoutConstraint.activate([
        hostingController.view.topAnchor.constraint(equalTo: view.topAnchor),
        hostingController.view.leadingAnchor.constraint(equalTo: view.leadingAnchor),
        hostingController.view.trailingAnchor.constraint(equalTo: view.trailingAnchor),
        hostingController.view.bottomAnchor.constraint(equalTo: view.bottomAnchor),
    ])
    hostingController.didMove(toParent: self)
}
5. Placing the site position on the map
VPS2MapData.poiCoordinate contains the latitude and longitude of the VPS2 site. This data was passed in when we initialized VPS2ARViewController and began localizing in startLocalization. From this data, we can create markers on the map at the respective locations.
Creating a map Annotation
MapKit provides the Annotation type for SwiftUI, which lets you place content on a map. We can use it to display points provided by the view model, representing both the user's location and the location of the VPS2 POI.
The following snippet creates a LocationIndicator object that can be passed to the Annotation constructor. This allows us to place a marker on the map, with an optional indicator for displaying heading:
Source: VPS2MapView.swift
private struct LocationIndicator: View {
    var color: Color
    var heading: Double? = nil

    var body: some View {
        ZStack {
            if let heading {
                HeadingCone()
                    .fill(color.opacity(0.2))
                    .frame(width: 60, height: 60)
                    .rotationEffect(.degrees(heading))
            }
            Circle()
                .fill(color)
                .frame(width: 10, height: 10)
        }
    }
}

private struct HeadingCone: Shape {
    func path(in rect: CGRect) -> Path {
        let center = CGPoint(x: rect.midX, y: rect.midY)
        var path = Path()
        path.move(to: center)
        path.addLine(to: CGPoint(x: center.x - 12, y: center.y - 30))
        path.addLine(to: CGPoint(x: center.x + 12, y: center.y - 30))
        path.closeSubpath()
        return path
    }
}
Adding the Annotation to the MapCard
This LocationIndicator can then be added to the map as part of an Annotation. When creating the Map View as part of the mapCard function, we can check if the map data contains user or PoI coordinates. If so, add an Annotation at those coordinates, passing in the LocationIndicator as content:
Source: VPS2MapView.swift
private func mapCard(in geometry: GeometryProxy) -> some View {
    Map(position: $cameraPosition, interactionModes: isExpanded ? .all : []) {
        if let coord = data.userCoordinate {
            Annotation("", coordinate: coord) {
                LocationIndicator(color: .blue, heading: data.userHeading)
            }
        }
        if let coord = data.poiCoordinate {
            Annotation("", coordinate: coord) {
                LocationIndicator(color: .red)
            }
        }
    }
    .mapStyle(.standard)
    .clipShape(RoundedRectangle(cornerRadius: 20))
    .overlay(alignment: .topTrailing) {
        if showCloseButton {
            closeButton
        }
    }
    .contentShape(RoundedRectangle(cornerRadius: 20))
    .onTapGesture {
        if !isExpanded { expand() }
    }
    .frame(
        width: isExpanded ? geometry.size.width - 40 : geometry.size.width / 3,
        height: isExpanded ? geometry.size.height - 40 : geometry.size.height * 0.25
    )
    .padding(.trailing, 20)
    .padding(.bottom, isExpanded ? 20 : 70)
    .shadow(color: .black.opacity(0.3), radius: isExpanded ? 10 : 4)
}
Next Steps
With an AR view in place and a VPS2 anchor to work from, we can now begin placing 3D content into the scene. See advanced AR effects and precise localization for further information.
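As a preview of that next step, pinning content to the localized anchor could look roughly like the following RealityKit sketch. The marker entity, the bindAnchorContent function, and the use of a `cancellables` set on the view controller are illustrative assumptions, not the sample's actual implementation:

```swift
// Hypothetical sketch: keep a simple marker entity pinned to the VPS2
// anchor's world transform as the view model publishes updates.
private let markerAnchor = AnchorEntity(world: .zero)

private func bindAnchorContent() {
    let marker = ModelEntity(
        mesh: .generateSphere(radius: 0.1),
        materials: [SimpleMaterial(color: .red, isMetallic: false)]
    )
    markerAnchor.addChild(marker)
    arManager.nsdkView.scene.addAnchor(markerAnchor)

    viewModel.$anchorWorldTransform
        .receive(on: DispatchQueue.main)
        .sink { [weak self] transform in
            guard let transform else { return }
            // Move the marker to the anchor's latest world-space pose.
            self?.markerAnchor.transform = Transform(matrix: transform)
        }
        .store(in: &cancellables)
}
```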