This project was an exploration into the use of Android sensors, gesture detection, and UI development via Jetpack Compose. The first screen showcases data obtained from two sensors, along with data obtained through location services. The second screen detects gestures performed on one half of the screen, displays a list of the gestures performed, and updates the position of the ball accordingly.
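The two sensor readings on the first screen would typically be obtained through Android's `SensorManager`. Below is a minimal sketch of that wiring; the class name and structure are illustrative assumptions, not the project's actual code, though `Sensor.TYPE_AMBIENT_TEMPERATURE` and `Sensor.TYPE_PRESSURE` are the real framework constants for these sensors.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Illustrative helper: collects the two readings shown on the first screen.
class SensorReader(context: Context) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager

    var temperatureCelsius: Float? = null
        private set
    var pressureHpa: Float? = null
        private set

    fun start() {
        // Either sensor may be absent on a given device, so check for null.
        sensorManager.getDefaultSensor(Sensor.TYPE_AMBIENT_TEMPERATURE)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
        sensorManager.getDefaultSensor(Sensor.TYPE_PRESSURE)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        when (event.sensor.type) {
            Sensor.TYPE_AMBIENT_TEMPERATURE -> temperatureCelsius = event.values[0]
            Sensor.TYPE_PRESSURE -> pressureHpa = event.values[0]
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```

Listeners registered this way should be unregistered (e.g. via `stop()`) when the screen leaves the foreground, to avoid draining the battery.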
The following required functionality is completed:
- User sees their location, the ambient temperature, and the air pressure sensed by the phone.
- User can drag the ball and move it within the gesture playground area.
- User sees a running list of the gestures they have performed, updated as they perform them.
The following extensions are implemented:
- ...
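The ball-drag behavior described above boils down to accumulating drag deltas and clamping the result so the ball stays inside the playground. A minimal sketch of that clamping logic follows; the function name and parameters are hypothetical, not taken from the project's code.

```kotlin
// Hypothetical helper: keep the ball fully inside the playground after a drag.
// Coordinates are the ball's top-left corner in playground-local pixels.
fun clampToPlayground(
    x: Float, y: Float,
    playgroundWidth: Float, playgroundHeight: Float,
    ballSize: Float,
): Pair<Float, Float> = Pair(
    x.coerceIn(0f, playgroundWidth - ballSize),
    y.coerceIn(0f, playgroundHeight - ballSize),
)
```

In Compose, a helper like this would be invoked from a drag handler such as `Modifier.pointerInput { detectDragGestures { change, dragAmount -> ... } }`, applying `dragAmount` to the ball's offset state before clamping.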
Here's a walkthrough of implemented user stories:
Jetpack Compose took an extensive amount of time to learn, since developing UI in it requires a distinct way of thinking about layout compared to XML. Publishing has also run into issues.
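To illustrate the mindset shift from XML: in Compose, layout is ordinary Kotlin code built from nested composable function calls rather than a separate markup tree, and state changes trigger recomposition instead of manual view updates. A small, hypothetical example of how the sensor readout section might be structured (the composable name and parameters are assumptions, not the project's actual code):

```kotlin
import androidx.compose.foundation.layout.Column
import androidx.compose.foundation.layout.padding
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.unit.dp

// Illustrative only: the whole section is a function, not an XML subtree.
@Composable
fun SensorReadout(temperature: Float?, pressure: Float?) {
    Column(modifier = Modifier.padding(16.dp)) {
        Text(text = "Ambient temperature: ${temperature ?: "unavailable"} °C")
        Text(text = "Air pressure: ${pressure ?: "unavailable"} hPa")
    }
}
```

Passing nullable readings in as parameters keeps the composable stateless, so it simply re-renders whenever the caller's sensor state changes.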
Copyright 2023 Kenneth Harper
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
