I checked the Magnifier app on my iPhone 6s and couldn’t find anything related to narrating what it sees.
It’s very possible that recognition happens locally; Apple has been pushing on-device processing for other features (e.g. Photos), but maybe the 6s is just too old to support it.
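For what it’s worth, Apple does expose on-device image classification to third-party apps through the Vision framework (iOS 13+), which at least shows that local recognition is part of the platform. Here’s a minimal sketch; I’m not claiming this is what Magnifier or VoiceOver use internally, just illustrating the on-device API:

```swift
import Vision
import UIKit

// Sketch only: classify an image entirely on-device with Vision.
// This is an assumption about how such a feature *could* work,
// not a description of Apple's actual Magnifier internals.
func describeImage(_ image: UIImage, completion: @escaping (String) -> Void) {
    guard let cgImage = image.cgImage else {
        completion("No image data")
        return
    }
    let request = VNClassifyImageRequest { request, _ in
        // Each observation carries a label ("dog", "coffee cup", ...)
        // and a confidence score; keep the few most confident ones.
        let labels = (request.results as? [VNClassificationObservation])?
            .filter { $0.confidence > 0.3 }
            .prefix(3)
            .map { $0.identifier } ?? []
        completion(labels.isEmpty ? "Nothing recognized"
                                  : labels.joined(separator: ", "))
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```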
As far as I understand, the narration is handled by VoiceOver, the general accessibility feature that reads screen contents and controls aloud. In the demo, Kristy taps the screen each time to invoke a VoiceOver description.
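From the developer side, the way VoiceOver “reads” the screen is that every view can carry an accessibility label, which VoiceOver speaks when the user taps the element. A quick sketch (the image name and label text are placeholders I made up):

```swift
import UIKit

// Minimal sketch of how an app exposes a spoken description to VoiceOver.
let photoView = UIImageView(image: UIImage(named: "beach")) // placeholder asset
photoView.isAccessibilityElement = true
photoView.accessibilityLabel = "A sandy beach with two palm trees"

// Apps can also have VoiceOver announce text programmatically:
UIAccessibility.post(notification: .announcement,
                     argument: "Photo loaded")
```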
The iPhone 6s shipped with iOS 9 but is upgradable to the current 15.6, though individual features may indeed depend on the particular phone, i.e. the hardware.
Also, apparently VoiceOver changes how the whole phone is controlled, replacing the standard gestures with its own set, so it’s not a feature you’d enable just for the camera. Still, it sounds to me like blind users would find it helpful anyway.