
Virtual Steering Wheel!

Control racing games with just your hands using AI! Tilt to steer, thumbs to drive or brake—no controller needed. Just a webcam!

YouTube Video: https://youtu.be/SWBh5BFjK2I?si=xGxdg_S0Dxw-8xy5

📜 Project Description:
This project transforms your webcam into a smart steering wheel using hand gesture recognition. By tracking both hands with MediaPipe and analyzing their positions and angles, we simulate accelerator (W), brake (S), left (A), and right (D) keypresses in racing games like Forza Horizon.

With no extra hardware—just a laptop and a webcam—this virtual steering system brings a hands-in-the-air gaming experience. It’s perfect for DIYers, gamers, or educators curious about computer vision and gesture control!

Using MediaPipe's hand landmarks, we detect when your left thumb is down (accelerator), right thumb is down (brake), and calculate the angle between your hands to determine whether to steer left, right, or go straight.
You can game just by moving your hands in the air—how cool is that?
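The mapping just described can be sketched as a tiny decision function. This is an illustrative helper, not part of the project file; the threshold value mirrors the one used later in the script:

```python
# Hypothetical sketch of the gesture-to-key mapping (not the project's code).
STRAIGHT_THRESHOLD = 6  # degrees of tilt still treated as "straight"

def decide_keys(angle_deg, left_thumb_down, right_thumb_down):
    """Return the set of keys the controller would hold for one frame."""
    keys = set()
    if angle_deg < -STRAIGHT_THRESHOLD:
        keys.add('a')          # hands tilted one way -> steer left
    elif angle_deg > STRAIGHT_THRESHOLD:
        keys.add('d')          # tilted the other way -> steer right
    if left_thumb_down:
        keys.add('w')          # left thumb down -> accelerate
    if right_thumb_down:
        keys.add('s')          # right thumb down -> brake
    return keys

print(decide_keys(-15, True, False))  # steering left while accelerating
```

The full script does the same thing, plus the webcam tracking that produces the angle and thumb states.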

What Is This Project?

Imagine you're playing a car racing game… but instead of using a joystick or keyboard, you just move your hands in front of a camera to drive the car! This project makes that possible using a webcam and Python.

You can:

  • 👐 Turn your hands left or right to steer the car.
  • 👍 Move your thumb down to speed up or slow down.

It’s like magic! But it’s actually made using something called Computer Vision.

This project is designed for complete beginners and fun enthusiasts who want to control a car using hand gestures with a webcam. The program uses MediaPipe to track your hands, calculates the tilt of the line joining your two wrists, and based on that angle, it sends keyboard commands to the game.
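In the code below, the steering angle is just the tilt of the line between the two wrist landmarks, computed with `math.atan2`. A minimal sketch with made-up pixel coordinates:

```python
import math

def wrist_line_angle(x1, y1, x2, y2):
    """Tilt, in degrees, of the line from the left wrist (x1, y1)
    to the right wrist (x2, y2). 0 means level hands; the sign
    flips as you tilt the imaginary steering wheel."""
    return math.degrees(math.atan2(y2 - y1, x2 - x1))

# Right hand slightly higher than the left (smaller y = higher on screen):
print(round(wrist_line_angle(100, 200, 300, 150)))  # negative -> steer left
```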

No fancy equipment needed—just a webcam, your hand, and a game that uses the keyboard (like pressing A/D to steer).

Let’s dive into how it works in the simplest way possible.

Supplies

Here’s everything you need:

  • A laptop or desktop computer (Windows, macOS, or Linux)
  • A webcam (built-in or USB)
  • Python installed (version 3.7 or later)
  • Python packages:
      • opencv-python
      • mediapipe
      • pyautogui

steerfinal.py — experimental code (x-python, 4.41 kB, uploaded 04/23/2025 at 04:50)
  • 1
    Step 1: Install Python
    If you haven’t already: go to https://www.python.org/, click “Download Python,” and install it like any other app.
  • 2
    Step 2: Install the Libraries

    These are the special helpers for Python:

    1. Open Command Prompt (Windows) or Terminal (Mac/Linux)
    2. Type the following and press Enter:

    pip install opencv-python mediapipe pyautogui

    What Are These Libraries?

    Let’s understand them like superheroes:

    1. 🧠 Mediapipe – This smart tool sees your hands and finds points like your thumb, wrist, and fingers.
    2. 👀 OpenCV – It helps us use the webcam and draw things on the screen.
    3. ⌨️ PyAutoGUI – It presses keys like W, A, S, D for you.
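A quick way to confirm the installs worked is to check that each package is importable. Note that OpenCV is installed as opencv-python but imported as cv2. This is a small sketch using Python's standard importlib, not part of the project file:

```python
import importlib.util

def missing_packages(names):
    """Return the packages from `names` that Python cannot import."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# Import names, not pip names (opencv-python imports as cv2):
print(missing_packages(["cv2", "mediapipe", "pyautogui"]))
```

If this prints an empty list, you're ready for the next step.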
  • 3
    Step 3: Copy the Code

    This took me days to figure out, and I’m giving you the code for free!

    Here’s the magic program. You can paste it into Notepad, or better yet, install VS Code.

    import cv2
    import mediapipe as mp
    import math
    import pyautogui
    import time
    
    mp_drawing = mp.solutions.drawing_utils
    mphands = mp.solutions.hands
    
    cap = cv2.VideoCapture(0)
    hands = mphands.Hands(static_image_mode=False, max_num_hands=2, min_detection_confidence=0.7, min_tracking_confidence=0.5)
    
    def calculate_angle(x1, y1, x2, y2):
        # Signed tilt of the wrist-to-wrist line in degrees (-180 to 180);
        # atan2 already returns this range, so no extra normalization is needed.
        return math.degrees(math.atan2(y2 - y1, x2 - x1))
    
    def is_thumb_up(hand_landmarks):
        # Thumb tip (landmark 4) above the thumb joint (landmark 3); y grows downward.
        return hand_landmarks.landmark[4].y < hand_landmarks.landmark[3].y
    
    pressing_w = False
    pressing_s = False
    
    turn_sensitivity = 0.01
    straight_threshold = 6
    tap_duration = 0.01 # Still used, but for logic-based tap, not sleep
    
    # Tap state for A and D keys
    tap_a_time = 0
    tap_d_time = 0
    tap_in_progress_a = False
    tap_in_progress_d = False
    
    while True:
        ret, image = cap.read()
        if not ret:
            break
    
        image = cv2.resize(image, (480, 360))
        image = cv2.cvtColor(cv2.flip(image, 1), cv2.COLOR_BGR2RGB)
        results = hands.process(image)
        image = cv2.cvtColor(image, cv2.COLOR_RGB2BGR)
    
        current_time = time.time()
    
        if results.multi_hand_landmarks and len(results.multi_hand_landmarks) == 2:
            hand1 = results.multi_hand_landmarks[0]
            hand2 = results.multi_hand_landmarks[1]
    
            # In the mirrored frame, the hand with the smaller x is the viewer's left hand.
            if hand1.landmark[0].x < hand2.landmark[0].x:
                left_hand = hand1
                right_hand = hand2
            else:
                left_hand = hand2
                right_hand = hand1
    
            x1, y1 = left_hand.landmark[0].x * image.shape[1], left_hand.landmark[0].y * image.shape[0]
            x2, y2 = right_hand.landmark[0].x * image.shape[1], right_hand.landmark[0].y * image.shape[0]
    
            mp_drawing.draw_landmarks(image, left_hand, mphands.HAND_CONNECTIONS)
            mp_drawing.draw_landmarks(image, right_hand, mphands.HAND_CONNECTIONS)
            cv2.line(image, (int(x1), int(y1)), (int(x2), int(y2)), (0, 255, 0), 2)
    
            thumb1_x, thumb1_y = left_hand.landmark[4].x * image.shape[1], left_hand.landmark[4].y * image.shape[0]
            thumb2_x, thumb2_y = right_hand.landmark[4].x * image.shape[1], right_hand.landmark[4].y * image.shape[0]
    
            cv2.putText(image, "Accelerator", (int(thumb1_x), int(thumb1_y) - 10), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
            cv2.putText(image, "Brake", (int(thumb2_x), int(thumb2_y) - 10), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 1)
    
            angle = calculate_angle(x1, y1, x2, y2)
            cv2.putText(image, f'{int(angle)} degrees', (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (255, 0, 0), 1)
    
            # Tap simulation for left and right
            if angle < -straight_threshold and not tap_in_progress_a:
                pyautogui.keyDown('a')
                tap_a_time = current_time
                tap_in_progress_a = True
                cv2.putText(image, "Left", (10, 210), cv2.FONT_HERSHEY_SIMPLEX, 0.7, (255, 255, 255), 2)
            elif angle > straight_threshold and not tap_in_progress_d:
                pyautogui.keyDown('d')
                tap_d_time = current_time
                tap_in_progress_d = True
                cv2.putText(image, "Right", (10, 210), cv2.FONT_HERSHEY_SIMPLEX, 0.7, (255, 255, 255), 2)
    
            # Release A key after tap_duration
            if tap_in_progress_a and (current_time - tap_a_time > tap_duration):
                pyautogui.keyUp('a')
                tap_in_progress_a = False
    
            # Release D key after tap_duration
            if tap_in_progress_d and (current_time - tap_d_time > tap_duration):
                pyautogui.keyUp('d')
                tap_in_progress_d = False
    
            # Acceleration
            if not is_thumb_up(left_hand):
                if not pressing_w:
                    pyautogui.keyDown('w')
                    pressing_w = True
            else:
                if pressing_w:
                    pyautogui.keyUp('w')
                    pressing_w = False
    
            # Braking
            if not is_thumb_up(right_hand):
                if not pressing_s:
                    pyautogui.keyDown('s')
                    pressing_s = True
            else:
                if pressing_s:
                    pyautogui.keyUp('s')
                    pressing_s = False
    
        cv2.imshow('Hand Tracker', image)
        if cv2.waitKey(1) & 0xFF == 27:
            break
    
    if pressing_w:
        pyautogui.keyUp('w')
    if pressing_s:
        pyautogui.keyUp('s')
    pyautogui.keyUp('a')
    pyautogui.keyUp('d')
    
    cap.release()
    cv2.destroyAllWindows()
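    One detail worth understanding is the A/D “tap” logic: instead of sleeping (which would freeze the camera loop), the script presses the key, records the time, and releases it on a later frame once tap_duration has elapsed. The same pattern, isolated as a tiny state machine with stand-in callbacks instead of pyautogui (an illustrative sketch, not the project's code):

```python
class KeyTapper:
    """Non-blocking tap: hold a key for `duration` seconds without sleep().
    `press` and `release` stand in for pyautogui.keyDown / keyUp."""
    def __init__(self, press, release, duration=0.01):
        self.press, self.release = press, release
        self.duration = duration
        self.started = None  # time the current tap began, or None

    def tap(self, now):
        if self.started is None:   # ignore repeat requests mid-tap
            self.press()
            self.started = now

    def update(self, now):
        if self.started is not None and now - self.started > self.duration:
            self.release()
            self.started = None

events = []
t = KeyTapper(lambda: events.append('down'), lambda: events.append('up'), 0.01)
t.tap(0.0)        # frame 1: key goes down
t.update(0.005)   # frame 2: too soon, key stays down
t.update(0.02)    # frame 3: duration elapsed, key released
print(events)     # ['down', 'up']
```

    This keeps the video loop responsive while still producing short key taps that racing games register as quick steering nudges.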

