Texture From Camera Activity
Direct the Camera preview to a GLES texture and manipulate it.
We manage the Camera and GLES rendering from a dedicated thread. We don't animate anything, so we don't need a Choreographer heartbeat -- we just redraw when the camera delivers a new frame or the user changes the size or position.
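The redraw-on-demand loop can be sketched in plain Java. The real activity uses an Android Handler/Looper on the render thread and SurfaceTexture's frame-available callback; here, ordinary monitor synchronization stands in for both, and all names are illustrative.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Sketch: a render thread that sleeps until something actually requires a
// redraw, instead of waking on a Choreographer heartbeat. Hypothetical names.
public class RenderLoopSketch {
    private final Object mLock = new Object();
    private boolean mFrameAvailable = false;   // set by the camera callback
    private boolean mGeometryChanged = false;  // set by UI size/position changes
    private boolean mRunning = true;
    public final AtomicInteger mDrawCount = new AtomicInteger();

    // Called when the camera produces a frame (stand-in for onFrameAvailable).
    public void onFrameAvailable() {
        synchronized (mLock) { mFrameAvailable = true; mLock.notifyAll(); }
    }

    // Called when the user moves a slider or touches the screen.
    public void onGeometryChanged() {
        synchronized (mLock) { mGeometryChanged = true; mLock.notifyAll(); }
    }

    public void stop() {
        synchronized (mLock) { mRunning = false; mLock.notifyAll(); }
    }

    // Body of the dedicated render thread.
    public void renderLoop() throws InterruptedException {
        while (true) {
            synchronized (mLock) {
                while (mRunning && !mFrameAvailable && !mGeometryChanged) {
                    mLock.wait();   // nothing to do; no busy redraw
                }
                if (!mRunning) return;
                mFrameAvailable = false;
                mGeometryChanged = false;
            }
            drawFrame();
        }
    }

    private void drawFrame() {
        // Real code would update the SurfaceTexture and issue GLES draw calls.
        mDrawCount.incrementAndGet();
    }
}
```

Note that multiple signals arriving before the thread wakes coalesce into a single redraw, which is exactly what we want for a preview that only needs the latest frame.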
The Camera needs to follow the activity pause/resume cycle so we don't keep it locked while we're in the background. Also, for power reasons, we don't want to keep getting frames when the screen is off. As noted in http://source.android.com/devices/graphics/architecture.html#activity, the Surface lifecycle isn't quite the same as the Activity's. We follow approach #1.
The tricky part about the lifecycle is that our SurfaceView's Surface can outlive the Activity, and we can get surface callbacks while paused, so we need to keep track of it in a static variable and be prepared for calls at odd times.
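The static-variable bookkeeping can be sketched as follows. The real code stores an android.view.SurfaceHolder received via SurfaceHolder.Callback; a plain Object stands in here so the sketch is self-contained, and the names are invented.

```java
// Sketch: tracking a Surface that can outlive the Activity instance.
public class SurfaceTracker {
    // Static so it survives Activity destroy/recreate (e.g. on rotation).
    private static Object sCachedSurface = null;

    // SurfaceHolder.Callback-style methods; these can fire while the
    // Activity is paused, so they must not assume a live Activity.
    public static void surfaceCreated(Object surface) {
        sCachedSurface = surface;
    }

    public static void surfaceDestroyed() {
        sCachedSurface = null;
    }

    // onResume must handle both orders: the surface may already exist
    // (rotation) or may be created shortly after resume (first launch).
    public static boolean surfaceReady() {
        return sCachedSurface != null;
    }
}
```

The point of the static field is that a freshly recreated Activity can ask "do I already have a surface?" before deciding whether to start rendering immediately or wait for the surfaceCreated callback.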
The zoom, size, and rotate values are determined by the values stored in the "seek bars" (sliders). When the device is rotated, the Activity is paused and resumed, but the controls retain their value, which is kind of nice. The position, set by touch, is lost on rotation.
The UI updates go through a multi-stage process:
- The user updates a slider.
- The new value is passed as a percent to the render thread.
- The render thread converts the percent to something concrete (e.g. size in pixels). The rect geometry is updated.
- (For most things) The values computed by the render thread are sent back to the main UI thread.
- (For most things) The UI thread updates some text views.
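The round trip above can be sketched with two queues standing in for the Handlers the real activity uses to pass messages between the UI thread and the render thread. The 2000-pixel maximum and all names here are invented for illustration.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Sketch: slider percent -> render thread -> concrete pixels -> back to UI.
public class SliderPipeline {
    static final int MAX_SIZE_PX = 2000;  // assumed maximum rect size

    // Step 3: the render thread converts a 0-100 percent to a pixel size.
    static int percentToPixels(int percent) {
        return MAX_SIZE_PX * percent / 100;
    }

    public static void main(String[] args) throws InterruptedException {
        // Stand-ins for the render thread's and UI thread's Handlers.
        BlockingQueue<Integer> toRender = new ArrayBlockingQueue<>(4);
        BlockingQueue<Integer> toUi = new ArrayBlockingQueue<>(4);

        // Render thread: receives a percent, computes pixels, reports back.
        Thread render = new Thread(() -> {
            try {
                int pct = toRender.take();        // step 2: percent arrives
                toUi.put(percentToPixels(pct));   // step 4: send value back
            } catch (InterruptedException ignored) { }
        });
        render.start();

        toRender.put(75);       // step 1: user moved the slider to 75%
        int px = toUi.take();   // step 5: UI thread updates a text view
        System.out.println("size = " + px + " px");
        render.join();
    }
}
```

Doing the percent-to-pixels conversion on the render thread keeps all geometry state owned by one thread; the UI thread only ever sees percents going out and display strings coming back.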