I might even go so far as to say that an app is really about the hardware, and that makes sense, because an app that is just a read-only screen doesn’t buy you much. There are plenty of data browsers. Now, look at TAK. You’re not just read-only; you’re also a set of eyes and sensors feeding situational awareness into the system. Ultimate transparency of anything you could call foggy ‘metadata’ may have to wait for the augmented-reality visor, but the cameras, gyro, and GPS location are already there, along with face-to-face meetings and sharing unexpected things as they come up. Your ‘node’ is a situational-awareness peer. That’s a much different goal than an app that is, actually, trying to teach you a new way to comprehend it all. Just sayin’. You get that peer capability with some apps because they keep the metadata out of the way.
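To make the “situational-awareness peer” idea concrete: TAK clients announce themselves by exchanging Cursor-on-Target (CoT) events, small XML messages carrying a position and a staleness window. Here is a minimal sketch of building one; the structure follows the public CoT schema, but the UID, coordinates, and error values are illustrative, not from the text.

```python
# Sketch of the kind of message a situational-awareness peer emits:
# a Cursor-on-Target (CoT) position event, the XML format TAK clients
# exchange. All concrete values below are illustrative placeholders.
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta, timezone

def cot_position(uid: str, lat: float, lon: float, stale_s: int = 60) -> str:
    """Build a CoT event announcing this node's GPS fix to its peers."""
    now = datetime.now(timezone.utc)
    fmt = "%Y-%m-%dT%H:%M:%S.%fZ"
    event = ET.Element("event", {
        "version": "2.0",
        "uid": uid,
        "type": "a-f-G-U-C",  # CoT type: atoms / friendly / ground unit
        "how": "m-g",         # position derived from GPS
        "time": now.strftime(fmt),
        "start": now.strftime(fmt),
        # After the stale time, peers treat this fix as outdated.
        "stale": (now + timedelta(seconds=stale_s)).strftime(fmt),
    })
    ET.SubElement(event, "point", {
        "lat": f"{lat:.6f}", "lon": f"{lon:.6f}",
        "hae": "0.0",                 # height above ellipsoid, meters
        "ce": "10.0", "le": "10.0",   # circular / linear error estimates
    })
    return ET.tostring(event, encoding="unicode")

print(cot_position("RAVINE-OBSERVER-1", 34.052200, -118.243700))
```

The point is how little there is to it: a node that can read its own GPS and timestamp a message is already contributing to everyone else’s picture, no middleware required.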
In principle, what you write on the morning board, and the stuff on the bulletin board you walk (or run) by, is stuff you’ve got on the phone. What sort of vegetation is down in that ravine? Is there heat in that tree? In that structure? Is the block evacuated, or not? Somebody down at the crossroads says there’s a horse in their yard; can you open the gate? Has anybody seen the horse, lately?
This is stuff you’d like. Hold the tablet up to the scene and ask, “is this the house, ma’am?” It doesn’t need a lot of setup, prep, and briefing. So, I guess if you want a technology, start with the sensors, avoid the middleware, and let the datagraph sort itself out.