# PixelBot

Universal Screen-Capture Aim Assist for Android featuring Computer Vision and Machine Learning.
- Overview
- Project Structure
- Features
- Targeting Algorithms & AI
- Build
- Current Limitations
- Roadmap (Shizuku)

## Overview
PixelBot is an advanced proof-of-concept Aim Assist application for Android devices.
It utilizes the MediaProjection API to capture the screen at high frame rates, cropping the video stream to a specific Field of View (FOV) around the user's crosshair. It then processes these frames in real-time using either Color Thresholding algorithms or TensorFlow Lite object detection models to pinpoint targets. Once a target is locked, PixelBot emulates precise swipe gestures to aim the camera automatically.
## Project Structure

| Component | Description |
|---|---|
| `ScreenCaptureManager` | Manages the MediaProjection virtual display, capturing the exact pixels within the custom FOV radius. |
| `TargetDetector` | Analyzes cropped FOV bitmaps using color-thresholding algorithms or neural networks to calculate the X/Y coordinates of the enemy. |
| `AimGestureService` | Accessibility service that takes the X/Y delta and computes a smooth, 60 fps tracking swipe gesture towards the target. |
| `AimOverlayService` | Floating-window service managing the in-game UI, transparent overlays, and the diagnostic telemetry console. |
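The cropping step performed by `ScreenCaptureManager` comes down to simple geometry: center a square FOV window on the crosshair and clamp it to the screen. The sketch below uses hypothetical names and a plain `int[]` rectangle rather than the project's actual API:

```java
/**
 * Sketch of the FOV-cropping geometry a screen-capture manager would apply
 * to each captured frame. Names and signature are illustrative only.
 */
public final class FovCrop {

    /** Returns the crop rectangle as {left, top, width, height}. */
    public static int[] cropRect(int screenW, int screenH,
                                 int fovRadius, int offsetX, int offsetY) {
        // Crosshair sits at screen centre plus the user-defined offset.
        int cx = screenW / 2 + offsetX;
        int cy = screenH / 2 + offsetY;

        // Clamp the square FOV window so it never leaves the screen.
        int left = Math.max(0, cx - fovRadius);
        int top = Math.max(0, cy - fovRadius);
        int right = Math.min(screenW, cx + fovRadius);
        int bottom = Math.min(screenH, cy + fovRadius);

        return new int[] {left, top, right - left, bottom - top};
    }

    public static void main(String[] args) {
        // 1080x2400 portrait screen, 200 px FOV radius, no offset.
        int[] r = cropRect(1080, 2400, 200, 0, 0);
        System.out.println(r[0] + "," + r[1] + "," + r[2] + "," + r[3]);
        // → 340,1000,400,400
    }
}
```

Clamping matters near screen edges: with a large X offset the window is truncated rather than sampling pixels outside the frame.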
## Features

- Dynamic FOV Settings: Real-time adjustable Field of View radius and X/Y offset to match asymmetrical game UI layouts.
- Multiple Detection Engines: Hot-swappable detection algorithms (hue tracking, RGB variance, TensorFlow AI).
- Virtual Joystick Bounds: Clamps the maximum distance of automated swipes to a predefined "Virtual Joystick" area, preventing the OS from rejecting massive screen-wide leaps.
- Smooth Routing Physics: Interpolates target vectors over a user-defined timescale via a chain of 20 ms micro-swipes, creating human-like ease-in tracking.
- In-Game Overlay UI: Draggable, transparent settings menu built with `WindowManager` parameters to prevent touch-blocking.
- Real-time Debug Console: Built-in floating console providing ms-latency and X/Y resolution metrics without requiring Logcat.
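The Virtual Joystick clamp and the 20 ms micro-swipe interpolation reduce to plain vector arithmetic. The sketch below uses illustrative names; the real gesture service presumably builds its swipe strokes on top of math like this:

```java
/**
 * Illustrative math behind the virtual-joystick clamp and the ease-in
 * micro-swipe chain; not the project's actual classes.
 */
public final class AimMath {

    /** Clamp an aim vector {dx, dy} to the virtual-joystick radius. */
    public static double[] clamp(double dx, double dy, double maxRadius) {
        double len = Math.hypot(dx, dy);
        if (len <= maxRadius || len == 0) {
            return new double[] {dx, dy};
        }
        double scale = maxRadius / len;
        return new double[] {dx * scale, dy * scale};
    }

    /**
     * Split a clamped vector into micro-step endpoints over the given
     * duration, using a quadratic ease-in so tracking starts gently.
     */
    public static double[][] microSwipes(double dx, double dy,
                                         int durationMs, int stepMs) {
        int steps = Math.max(1, durationMs / stepMs);
        double[][] points = new double[steps][2];
        for (int i = 1; i <= steps; i++) {
            double t = (double) i / steps; // normalized time in (0, 1]
            double eased = t * t;          // quadratic ease-in curve
            points[i - 1][0] = dx * eased;
            points[i - 1][1] = dy * eased;
        }
        return points;
    }
}
```

For example, a 100 ms tracking window at 20 ms per step yields five micro-swipe endpoints that ease in toward the full delta, with the final endpoint landing exactly on the (clamped) target vector.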
## Targeting Algorithms & AI

PixelBot features three selectable identification presets via the in-game menu:

- Simple Red Dominance: A lightweight, high-performance algorithm that tracks pure-red pixels, ignoring the blue and green channels. Optimal for simply highlighted targets.
- Dynamic Hue Targeting: Converts the RGB FOV to HSV, allowing the user to select a precise color degree (0-360) and tolerance.
- AI Person Detection (TensorFlow Lite):
  - Utilizes `ssd_mobilenet_v1_1_metadata_1.tflite` for human identification.
  - Designed for games with complex environments where rudimentary color tracking produces false positives.
  - Includes an adjustable "Confidence Threshold" slider to balance accuracy against detection speed.
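The two color-based presets can be sketched as per-pixel predicates on ARGB values. The thresholds below are illustrative defaults, not the app's tuned values:

```java
/**
 * Sketch of the two color-based detection presets as per-pixel predicates
 * on packed ARGB ints. Thresholds are illustrative, not the app's values.
 */
public final class ColorDetect {

    /** Simple Red Dominance: red channel must strongly dominate G and B. */
    public static boolean isRedDominant(int argb) {
        int r = (argb >> 16) & 0xFF;
        int g = (argb >> 8) & 0xFF;
        int b = argb & 0xFF;
        return r > 150 && r > g * 2 && r > b * 2;
    }

    /**
     * Dynamic Hue Targeting: convert RGB to a hue in degrees (0-360),
     * then compare against the user-selected hue with a wrap-around
     * tolerance so e.g. 355° still matches a target hue of 0°.
     */
    public static boolean matchesHue(int argb, double targetHue, double tolerance) {
        double r = ((argb >> 16) & 0xFF) / 255.0;
        double g = ((argb >> 8) & 0xFF) / 255.0;
        double b = (argb & 0xFF) / 255.0;
        double max = Math.max(r, Math.max(g, b));
        double min = Math.min(r, Math.min(g, b));
        double delta = max - min;
        if (delta == 0) return false; // grey pixels carry no hue
        double hue;
        if (max == r)      hue = 60 * (((g - b) / delta) % 6);
        else if (max == g) hue = 60 * (((b - r) / delta) + 2);
        else               hue = 60 * (((r - g) / delta) + 4);
        if (hue < 0) hue += 360;
        // Distance on the hue circle, accounting for the 360° -> 0° wrap.
        double diff = Math.abs(hue - targetHue);
        diff = Math.min(diff, 360 - diff);
        return diff <= tolerance;
    }
}
```

In practice the detector would run a predicate like this over every pixel of the cropped FOV bitmap and return the centroid of matching pixels as the target coordinate.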
## Build

Requirements:

- Windows
- Java Development Kit (JDK 21)
- Android SDK Build Tools

Execute the included batch script from the project root. It handles Gradle assembly and signing of the debug APK:

```bat
.\build.bat
```

After the build completes, the packaged APK is copied to:

```
output\PixelBot-debug.apk
```
## Current Limitations

Currently, the gesture-injection component (`AimGestureService`) runs through the Android `AccessibilityService` API.

When `AccessibilityService.dispatchGesture()` fires a virtual swipe, the Android input framework intentionally dispatches an `ACTION_CANCEL` event to all active physical touches on the digitizer.

If the user is physically pressing the "Fire" button when PixelBot detects a target, the resulting micro-swipe cancels the thumb input, forcing the player to release and re-press the screen to continue firing.
## Roadmap (Shizuku)

To resolve the Accessibility cancellation bug without requiring full kernel root, the gesture architecture must be migrated to Shizuku.

By routing injection commands through Shizuku:

- PixelBot bypasses the `AccessibilityService` safeguards.
- Touch coordinates can be pushed natively to `InputManager` or `/dev/uinput`.
- The Android OS processes both the physical thumb touch (firing) and the automated swipe (aiming) simultaneously, enabling uninterrupted multi-touch tracking.
