PixelBot


Universal Screen-Capture Aim Assist for Android featuring Computer Vision and Machine Learning.

-----------------------------------------------------

Table of Contents

  • Overview
  • Project Structure
  • Features
  • Targeting Algorithms & AI
  • Build
  • Current Limitations
  • Roadmap (Shizuku)

-----------------------------------------------------

Overview

PixelBot is an advanced proof-of-concept Aim Assist application for Android devices (no root required).

It utilizes the MediaProjection API to capture the screen at high frame rates, cropping the video stream to a specific Field of View (FOV) around the user's crosshair. It then processes these frames in real-time using either Color Thresholding algorithms or TensorFlow Lite object detection models to pinpoint targets. Once a target is locked, PixelBot emulates precise swipe gestures to aim the camera automatically.
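The FOV-cropping step above boils down to simple rectangle math: bound the FOV circle around the crosshair with a square, clamped to the screen edges. A minimal sketch (class and method names here are illustrative, not PixelBot's actual API):

```java
// Sketch: compute the crop rectangle for a circular FOV centered on the
// crosshair, clamped so it never leaves the screen. Illustrative only;
// not the actual PixelBot implementation.
public final class FovCrop {
    /** Returns {left, top, width, height} of the square region that
     *  bounds a FOV circle of the given radius, clamped to the screen. */
    public static int[] fovRect(int screenW, int screenH,
                                int centerX, int centerY, int radius) {
        int left   = Math.max(0, centerX - radius);
        int top    = Math.max(0, centerY - radius);
        int right  = Math.min(screenW, centerX + radius);
        int bottom = Math.min(screenH, centerY + radius);
        return new int[] { left, top, right - left, bottom - top };
    }
}
```

Cropping to this rectangle before detection keeps the per-frame pixel count small, which is what makes high-frame-rate processing feasible.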

-----------------------------------------------------

Project Structure

  • ScreenCaptureManager: Manages the MediaProjection virtual display, capturing the exact pixels within the custom FOV radius.
  • TargetDetector: Analyzes cropped FOV bitmaps using color-thresholding algorithms or neural networks to calculate the X/Y coordinates of the target.
  • AimGestureService: Accessibility service responsible for taking the X/Y delta and computing a smooth, 60 fps tracking swipe gesture towards the target.
  • AimOverlayService: Floating-window service managing the in-game UI, transparent overlays, and the diagnostic telemetry console.

-----------------------------------------------------

Features

  • Dynamic FOV Settings: Real-time adjustable Field of View radius and X/Y offset to match asymmetrical game UI layouts.
  • Multiple Detection Engines: Hot-swappable algorithms for detection (Hue tracking, RGB Variance, TensorFlow AI).
  • Virtual Joystick Bounds: Clamps the maximum distance of automated swipes to a predefined "Virtual Joystick" area, preventing the OS from rejecting massive screen-wide leaps.
  • Smooth Routing Physics: Interpolates target vectors over a user-defined duration via a chain of 20ms micro-swipes, creating human-like ease-in tracking.
  • In-Game Overlay UI: Draggable, transparent settings menu built with WindowManager parameters to prevent touch-blocking.
  • Real-time Debug Console: Built-in floating console reporting per-frame latency (ms) and X/Y resolution metrics without requiring Logcat.
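The joystick-clamping and micro-swipe features above can be sketched as two small pure-math steps: clamp the aim delta to a maximum travel, then split it into equal segments (linear interpolation here; PixelBot's ease-in curve would weight the steps unevenly). Names are illustrative, not PixelBot's actual API:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of "Virtual Joystick Bounds" + "Smooth Routing Physics":
// clamp the total aim delta, then chop it into micro-swipe segments
// (the README describes chains of 20ms micro-swipes). Illustrative only.
public final class AimPath {
    /** Clamp the (dx, dy) vector so its length never exceeds maxTravel. */
    public static double[] clamp(double dx, double dy, double maxTravel) {
        double len = Math.hypot(dx, dy);
        if (len <= maxTravel || len == 0) return new double[] { dx, dy };
        double s = maxTravel / len;
        return new double[] { dx * s, dy * s };
    }

    /** Split a clamped delta into `steps` equal micro-swipe segments. */
    public static List<double[]> microSwipes(double dx, double dy, int steps) {
        List<double[]> segs = new ArrayList<>();
        for (int i = 0; i < steps; i++) {
            segs.add(new double[] { dx / steps, dy / steps });
        }
        return segs;
    }
}
```

Clamping first is what keeps each dispatched gesture inside the "virtual joystick" area, so the OS never sees an implausible screen-wide leap.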

-----------------------------------------------------

Targeting Algorithms & AI

PixelBot features three selectable identification presets via the in-game menu:

  1. Simple Red Dominance: A lightweight, high-performance algorithm that tracks red-dominant pixels, ignoring the blue and green channels. Optimal for games that highlight targets in a simple solid color.
  2. Dynamic Hue Targeting: Converts the RGB FOV to HSV, allowing the user to select a precise color degree (0-360) and tolerance.
  3. AI Person Detection (TensorFlow Lite):
    • Utilizes ssd_mobilenet_v1_1_metadata_1.tflite for human identification.
    • Designed for games with complex environments where rudimentary color tracking produces false positives.
    • Includes an adjustable "Confidence Threshold" slider to balance accuracy vs. detection speed.
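The two color-based presets above can be sketched as per-pixel tests on raw ARGB values. The thresholds (redness margin, hue tolerance) are illustrative defaults, not PixelBot's tuned values, and the method names are hypothetical:

```java
// Sketch of the two color-based detection presets on ARGB pixels.
// Illustrative only; not the actual PixelBot implementation.
public final class ColorDetect {
    /** Preset 1, "Red Dominance": the red channel must exceed both the
     *  green and blue channels by a fixed margin. */
    public static boolean isRedDominant(int argb, int margin) {
        int r = (argb >> 16) & 0xFF, g = (argb >> 8) & 0xFF, b = argb & 0xFF;
        return r - g >= margin && r - b >= margin;
    }

    /** Preset 2, "Hue Targeting": convert RGB to a 0-360 hue and compare
     *  against a target hue with a tolerance, wrapping around 360. */
    public static boolean matchesHue(int argb, double targetHue, double tol) {
        int r = (argb >> 16) & 0xFF, g = (argb >> 8) & 0xFF, b = argb & 0xFF;
        double max = Math.max(r, Math.max(g, b));
        double min = Math.min(r, Math.min(g, b));
        double d = max - min;
        if (d == 0) return false; // grey pixels carry no hue information
        double h;
        if (max == r)      h = 60 * (((g - b) / d) % 6);
        else if (max == g) h = 60 * (((b - r) / d) + 2);
        else               h = 60 * (((r - g) / d) + 4);
        if (h < 0) h += 360;
        double diff = Math.abs(h - targetHue);
        return Math.min(diff, 360 - diff) <= tol; // wrap-around distance
    }
}
```

Scanning the cropped FOV with one of these predicates and averaging the matching pixel coordinates yields the X/Y target point; the TensorFlow Lite preset replaces this per-pixel test with a model inference over the whole crop.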

-----------------------------------------------------

Build

Requirements

  • Windows
  • Java Development Kit (JDK 21)
  • Android SDK Build Tools

Build command

Execute the included batch script from the project root; it runs the Gradle build and signs the debug APK.

.\build.bat

Output

After the build completes, the packaged APK is copied to:

output\PixelBot-debug.apk

-----------------------------------------------------

Current Limitations

The Accessibility Multi-Touch Bug

Currently, the gesture injection component (AimGestureService) runs via the Android AccessibilityService. When AccessibilityService.dispatchGesture() fires a virtual swipe, the Android Input Framework intentionally dispatches an ACTION_CANCEL event to all active physical touches on the digitizer.

If a user is physically pressing the "Fire" button, and PixelBot detects a target, the resulting micro-swipe will cancel the user's thumb input. This forces the player to release and re-press the screen to continue firing.

-----------------------------------------------------

Roadmap (Shizuku)

To resolve the Accessibility cancellation bug without requiring root access, the gesture architecture must be migrated to Shizuku.

By routing injection commands through Shizuku:

  • PixelBot bypasses the AccessibilityService safeguards.
  • Touch coordinates can be pushed natively to InputManager or /dev/uinput.
  • The Android OS processes both the physical thumb touch (firing) and the automated swipe (aiming) simultaneously, enabling uninterrupted multi-touch tracking.
