Recently, while preparing some demos, I played a bit with mobile + Forge Viewer. I was thinking about a scenario that reads data from a mobile device's sensors or touch behavior and emits it to Forge Viewer. Setting up a native mobile app would be tedious for me, so I checked whether JavaScript could do the job. Fortunately, such technologies are quite mature: searching for 'javascript mobile sensor' on the internet turns up quite a lot of relevant articles.
So I played with a few things: the gyro sensor, touch, and camera/photo. In this article I will show the demos on the gyro sensor and touch, and cover camera/photo in the next blog.
Actually, the core code is very straightforward. For example, to monitor the gyro sensor, the relevant code would be:
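A minimal sketch, assuming the standard W3C `deviceorientation` event; the `socket` emit in the comment is a hypothetical placeholder, not code from the sample:

```javascript
// alpha = rotation around z, beta = around x, gamma = around y (degrees).
function readOrientation(evt) {
  return { alpha: evt.alpha, beta: evt.beta, gamma: evt.gamma };
}

// Only wire up the listener when running in a browser that supports it.
if (typeof window !== 'undefined' && window.DeviceOrientationEvent) {
  window.addEventListener('deviceorientation', (evt) => {
    const { alpha, beta, gamma } = readOrientation(evt);
    console.log(alpha, beta, gamma);
    // e.g. socket.emit('gyro', { alpha, beta, gamma });  // hypothetical socket
  });
}
```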
Touch, meanwhile, is similar to the mouse workflow on a PC. The relevant events map as follows: touchstart = mousedown; touchend = mouseup; touchmove = mousemove.
The only thing to be aware of is that on mobile a touch can involve one finger or several, so you may need to pick the corresponding finger's touch point from the collection evt.changedTouches.
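A minimal sketch of this touch handling, assuming the standard touch events; `touchDistance` is a hypothetical helper, not from the sample:

```javascript
// Distance in pixels between two touch points, e.g. a drag's start and end.
function touchDistance(startTouch, endTouch) {
  const dx = endTouch.clientX - startTouch.clientX;
  const dy = endTouch.clientY - startTouch.clientY;
  return Math.sqrt(dx * dx + dy * dy);
}

// Only attach listeners when running in a browser.
if (typeof document !== 'undefined') {
  let start = null;
  document.addEventListener('touchstart', (evt) => {
    // evt.changedTouches holds one entry per finger; take the first.
    start = evt.changedTouches[0];
  });
  document.addEventListener('touchend', (evt) => {
    if (!start) return;
    const dist = touchDistance(start, evt.changedTouches[0]);
    console.log('dragged', dist, 'px');
  });
}
```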
To have fun, I applied the workflow to two demos:
1. Remotely move an object within Forge Viewer using the gyro sensor data. The code draws a cylinder in the viewer with Three.js and listens for data from the socket; whenever the data changes, the cylinder moves according to the three gyro values.
2. Remotely navigate in the viewer. The web page on the mobile device sends gyro data or the touch drag distance to the socket. The main page listens for that data, translates the gyro values into the viewer's rotation angles, and translates the touch distance into a forward/backward (walking) distance.
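Demo 1 could be sketched roughly as below. The names `viewer` and `socket`, and the `'gyro'` message, are assumptions for illustration; only the degrees-to-radians mapping is shown as runnable code:

```javascript
// Map the three gyro angles (degrees) onto Three.js Euler rotations (radians).
function gyroToRotation(alpha, beta, gamma) {
  const d2r = Math.PI / 180;
  return { x: beta * d2r, y: gamma * d2r, z: alpha * d2r };
}

// Apply a gyro reading to any object exposing a Three.js-style rotation.
function applyGyro(mesh, data) {
  const r = gyroToRotation(data.alpha, data.beta, data.gamma);
  mesh.rotation.set(r.x, r.y, r.z);
}

// Wiring sketch inside the viewer page (hypothetical names):
// const cylinder = new THREE.Mesh(new THREE.CylinderGeometry(1, 1, 5), material);
// viewer.impl.scene.add(cylinder);
// socket.on('gyro', (data) => {
//   applyGyro(cylinder, data);
//   viewer.impl.invalidate(true);   // ask the viewer to re-render
// });
```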
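For demo 2, the touch-to-walk part could look like this sketch. The `forwardStep` helper and the `pxPerUnit` scale factor are assumptions; the commented wiring uses the viewer's navigation object only as an illustration:

```javascript
// Step the camera along its view direction by a distance derived from the
// touch drag length (pixels), scaled into model units.
function forwardStep(position, target, distancePx, pxPerUnit = 100) {
  const dx = target.x - position.x;
  const dy = target.y - position.y;
  const dz = target.z - position.z;
  const len = Math.sqrt(dx * dx + dy * dy + dz * dz) || 1; // unit direction
  const step = distancePx / pxPerUnit;
  return {
    x: position.x + (dx / len) * step,
    y: position.y + (dy / len) * step,
    z: position.z + (dz / len) * step,
  };
}

// Wiring sketch on the main page (hypothetical 'walk' message):
// socket.on('walk', ({ distance }) => {
//   const cam = viewer.navigation.getCamera();
//   const pos = forwardStep(cam.position, viewer.navigation.getTarget(), distance);
//   viewer.navigation.setPosition(pos);
// });
```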
The complete sample code is available at: