@flarb that always felt like the biggest problem to me; too slow on basic features, and you couldn't polyfill. (Same problem when their VR, then XR, abstraction turned up... it was way behind the SDKs and only covered, say, one headset instead of the 5 we needed.)
Unity needs to let users plug their own low-level implementations in underneath its abstraction.
@Soylentgraham I think now there are enough platforms trying to do the same thing (Meta Presence Platform, Snapdragon Spaces, Magic Leap, HoloLens, etc.) that there may be some movement on this. The tricky part will probably be how each platform does the equivalent of Microsoft's old Scene Understanding API; I can't see a one-size-fits-all approach for all hardware on that one.
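To make that concrete, here's a minimal hypothetical sketch (invented names, not any shipping API) of how an abstraction could expose per-platform scene-understanding capabilities, so app code can degrade gracefully instead of assuming one feature set:

    // Hypothetical capability flags; real platforms expose different subsets
    // (HoloLens Scene Understanding gave labelled quads + meshes, others give
    // classified planes only, etc.), so the abstraction has to let the app ask.
    [System.Flags]
    public enum SceneCapability
    {
        None           = 0,
        PlaneDetection = 1 << 0,
        MeshedGeometry = 1 << 1,
        SemanticLabels = 1 << 2, // "wall", "floor", "ceiling"...
    }

    public interface ISceneUnderstanding
    {
        SceneCapability Capabilities { get; }
    }

    // Stand-in platform for the demo: pretend it has planes but no labels.
    public sealed class StubScene : ISceneUnderstanding
    {
        public SceneCapability Capabilities => SceneCapability.PlaneDetection;
    }

    public static class SceneDemo
    {
        public static void Configure(ISceneUnderstanding scene)
        {
            // Game code branches on capabilities instead of platform #ifdefs.
            if ((scene.Capabilities & SceneCapability.SemanticLabels) != 0)
                System.Console.WriteLine("Snap content to labelled walls/floors.");
            else if ((scene.Capabilities & SceneCapability.PlaneDetection) != 0)
                System.Console.WriteLine("Fall back to raw planes.");
            else
                System.Console.WriteLine("Fall back to manual user placement.");
        }

        public static void Main() => Configure(new StubScene());
    }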
@flarb yeah, which is why people should be able to write their own polyfills for features, however they want them for their app. Meanwhile the game side still uses the clean API!
Instead, the messy, hacky extra abstraction gets intertwined with game code (which should live only on top of the abstraction).
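As a rough illustration of the split being argued for (hypothetical names only, not Unity's actual subsystem API): the game depends on one clean interface, and a per-app polyfill provider can slot in underneath when a platform lacks the feature, without the hack ever touching game code.

    // Hypothetical shape of the abstraction/provider split (invented names).
    using System.Collections.Generic;
    using System.Numerics;

    public interface IPlaneProvider
    {
        IReadOnlyList<Vector3> GetPlaneCenters(); // the clean API the game sees
    }

    // A vendor-supplied provider would implement the same interface here,
    // wrapping its native SDK calls (omitted).

    // App-specific polyfill: e.g. synthesise a single floor plane on hardware
    // with no plane detection at all.
    public sealed class FloorPolyfillProvider : IPlaneProvider
    {
        public IReadOnlyList<Vector3> GetPlaneCenters()
            => new[] { new Vector3(0f, 0f, 0f) }; // assume floor at the origin
    }

    public static class Game
    {
        // Game code only ever sees IPlaneProvider; swapping providers is a
        // registration detail, not a game-code change.
        public static void PlaceContent(IPlaneProvider planes)
        {
            foreach (var center in planes.GetPlaneCenters())
                System.Console.WriteLine($"Spawn object at {center}");
        }

        public static void Main() => PlaceContent(new FloorPolyfillProvider());
    }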
@flarb it should be in a package! :)
@Soylentgraham the Snapdragon Spaces SDK is!
@flarb hmmm! I'm gonna have to see if it really is underneath ARFoundation then; I'll be pleased if we can start using Unity's abstractions.
@Soylentgraham You might be right, but I could have sworn the Snapdragon docs say Qualcomm implements the low-level details of the interfaces themselves? I think you can actually open the ARFoundation classes and see Qualcomm's own code in them.