@flarb that always felt like the biggest problem to me; too slow on basic features, and you couldn't polyfill. (Same problem when their VR, then XR, abstraction turned up... it was way behind the SDKs and only covered, say, one headset instead of the 5 we needed).
Unity needs to let users plug their own low-level implementations in underneath its abstraction
@Soylentgraham I think now there are enough platforms trying to do the same thing (Meta Presence Platform, Snapdragon Spaces, Magic Leap, HoloLens, etc.) that there may be some movement on this. The tricky part will probably be how each platform does the equivalent of Microsoft's old Scene Understanding API--I can't see a one-size-fits-all approach for all hardware on that one.
@flarb yeah, which is why people should be able to write their own polyfill for a feature, implemented however they want for their app, while the game side keeps using the clean API!
Instead, the messy hacky extra abstraction ends up intertwined with the game code (which should only ever live on top of the abstraction)
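To illustrate the shape of what I mean, here's a minimal sketch (every name here is hypothetical, not any real Unity or vendor API) of a per-app polyfill sitting behind the one interface the game code ever touches:

```csharp
using UnityEngine;

// Hypothetical game-facing interface: game code only ever depends on this.
public interface IHandTracking
{
    bool TryGetPalmPosition(out Vector3 position);
}

// Implementation backed by the official abstraction, when the feature exists there.
public class PlatformHandTracking : IHandTracking
{
    public bool TryGetPalmPosition(out Vector3 position)
    {
        // ...would forward to the platform abstraction's hand-tracking feature...
        position = default;
        return false;
    }
}

// App-supplied polyfill: calls the vendor SDK (or fakes data) however the app wants.
public class PolyfillHandTracking : IHandTracking
{
    public bool TryGetPalmPosition(out Vector3 position)
    {
        // ...vendor SDK call or stand-in data goes here, hidden from game code...
        position = Vector3.zero;
        return true;
    }
}

public static class HandTrackingProvider
{
    // Choose the real implementation when the abstraction supports it,
    // otherwise the app's polyfill; the game side never sees the difference.
    public static IHandTracking Create(bool platformSupportsHands)
        => platformSupportsHands
            ? new PlatformHandTracking()
            : (IHandTracking)new PolyfillHandTracking();
}
```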
@Soylentgraham I'm using the Snapdragon Spaces implementation, and they basically use AR Foundation but then provide their own APIs for missing features--such as hand tracking. But I'm assuming it would be possible to access platform-specific features in plane tracking etc., since I guess the vendor implements the AR Foundation classes themselves?
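If the vendor really does implement the AR Foundation subsystems underneath, then ordinary AR Foundation calls should work against their plane data unchanged; something like this (standard AR Foundation usage, nothing Snapdragon-specific, and untested against Spaces) stays provider-agnostic:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Lists whatever planes the active provider reports, via the standard
// AR Foundation manager; the vendor's subsystem sits invisibly underneath.
public class PlaneLister : MonoBehaviour
{
    [SerializeField] ARPlaneManager planeManager;

    void Update()
    {
        foreach (ARPlane plane in planeManager.trackables)
        {
            Debug.Log($"Plane {plane.trackableId} at {plane.center}, alignment {plane.alignment}");
        }
    }
}
```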
@flarb hmm I assume Unity implements the abstraction, otherwise we could all write our own vendor implementations? (Hence why it falls behind :)
@flarb hmmm! I'm gonna have to see if it really does sit underneath AR Foundation then; will be pleased if we can start using Unity's abstractions
@Soylentgraham the Snapdragon Spaces SDK is!