Control GPIO using the new Linux user space GPIO API

Since version 4.8, the Linux kernel has provided a new user space API based on character devices for managing and controlling GPIOs (General-Purpose Input/Output). This post presents the basics of the new interface as well as a simple tutorial/example demonstrating how to use the new API to control GPIOs.

The hardware used in the tutorial is a Raspberry Pi 3B, but the code is generic and can be used on any embedded Linux hardware.
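As a taste of what the rest of the post covers, here is a minimal sketch of the character device API (the v1 uAPI from linux/gpio.h): it requests one line as an output and toggles it. The chip path /dev/gpiochip0, the line offset 17 and the consumer label "demo" are assumptions for a Raspberry Pi; adjust them for your board.

/* Minimal sketch: toggle a GPIO line via the character device API
 * (linux/gpio.h, v1 uAPI). Assumes /dev/gpiochip0 and line offset 17
 * (BCM17 on a Raspberry Pi) -- adjust for your board.
 */
#include <linux/gpio.h>
#include <sys/ioctl.h>
#include <fcntl.h>
#include <unistd.h>
#include <string.h>
#include <stdio.h>

int main(void)
{
    int fd = open("/dev/gpiochip0", O_RDONLY);
    if (fd < 0) {
        perror("open /dev/gpiochip0");
        return 1;
    }

    /* Request one line as an output */
    struct gpiohandle_request req;
    memset(&req, 0, sizeof(req));
    req.lineoffsets[0] = 17;
    req.lines = 1;
    req.flags = GPIOHANDLE_REQUEST_OUTPUT;
    strcpy(req.consumer_label, "demo");

    if (ioctl(fd, GPIO_GET_LINEHANDLE_IOCTL, &req) < 0) {
        perror("GPIO_GET_LINEHANDLE_IOCTL");
        close(fd);
        return 1;
    }

    /* Drive the line high, wait, then drive it low again */
    struct gpiohandle_data data;
    memset(&data, 0, sizeof(data));
    data.values[0] = 1;
    if (ioctl(req.fd, GPIOHANDLE_SET_LINE_VALUES_IOCTL, &data) < 0)
        perror("GPIOHANDLE_SET_LINE_VALUES_IOCTL");

    sleep(1);
    data.values[0] = 0;
    ioctl(req.fd, GPIOHANDLE_SET_LINE_VALUES_IOCTL, &data);

    close(req.fd);
    close(fd);
    return 0;
}

Compile it with gcc and run it with sufficient permissions on /dev/gpiochip0; the libgpiod library offers a higher-level wrapper around the same character device interface.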

Meet Dolly the robot

Ladies and gentlemen, please meet "Dolly the robot", the first version of my DIY mobile robot. My goal in this DIY project is to make a low-cost yet feature-rich ROS (Robot Operating System) based mobile robot that allows me to experiment with autonomous robotics at home. To that end, Dolly is designed with all the basic features needed. To keep the bill of materials as low as possible, I tried to recycle all of my spare hardware parts.

Specification

  • The robot's chassis is 3D printed; the chassis plate design is borrowed from the Turtlebot 3, which is a smart design, IMO. The other hardware parts, however, are completely different from the Turtlebot 3.
  • IMU sensor with 9 DOF (accelerometer, magnetometer and gyroscope) for robot orientation measurement
  • Two DC motors with magnetic encoders, used to drive the wheels and for odometry
  • Arduino Mega 2560 for low-level control of the robot
  • Raspberry Pi 3B+ with embedded Linux for high-level algorithms and network communication. The ROS middleware on top of the Linux system offers a powerful robotic software environment
  • A 360-degree Neato LiDAR (laser scanner) with up to 6 m range
  • An 8-megapixel camera (Raspberry Pi camera)
  • Adafruit Motor Shield V2 for motor control
  • 10000 mAh battery
  • ADS1115 ADC (analog-to-digital converter) to measure and monitor battery usage
  • 0.95" (128x64) mini OLED display
  • The robot can be tele-operated using a Bluetooth controller such as a PS4 controller

Applications

  • Localization and mapping (SLAM)
  • Obstacle avoidance
  • Autonomous navigation
  • Robot perception algorithms with LiDAR sensor and camera
  • Much more...

Interfacing Raspberry Pi and LPC1114FN28 via SPI

In one of my previous posts, I mentioned building a toy car project using a Raspberry Pi as the brain and an LPC1114FN28 for low-level control. This post describes this hobby project in detail.

Basically, in this project, the Raspberry Pi (running a minimal version of Debian, not Raspbian) acts as a master that:

  1. performs high-level calculations (software algorithms) based on the data it collects from the LPC chip (slave)
  2. issues control commands to the slave chip (LPC) for low-level control
  3. reads images from the Raspberry Pi camera for vision tasks
  4. takes care of network communication for remote control
  5. easily implements a lot of other fun stuff...

One question: why not use the Pi to communicate directly with the sensors and actuators? Although the Pi is a fairly powerful system, it lacks some low-level features that we need in this project, such as an ADC for reading analog sensors and a precise hardware PWM controller for motor control. Therefore, I decided to use it along with the LPC chip, which is better suited for these low-level tasks.
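To give an idea of the master side, here is a minimal sketch of a single SPI exchange from the Pi using the Linux spidev driver (/dev/spidev0.0). The command byte and the 3-byte reply are hypothetical placeholders; the real framing, clock speed and SPI mode depend on the firmware running on the LPC1114FN28.

/* Minimal sketch of the Pi-side SPI master talking to the LPC slave
 * through the Linux spidev driver. The 0x01 "read sensor" command and
 * the reply layout are hypothetical -- adapt them to your firmware.
 */
#include <linux/spi/spidev.h>
#include <sys/ioctl.h>
#include <fcntl.h>
#include <stdint.h>
#include <stdio.h>
#include <string.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/spidev0.0", O_RDWR);
    if (fd < 0) {
        perror("open /dev/spidev0.0");
        return 1;
    }

    uint8_t mode = SPI_MODE_0;
    uint32_t speed = 500000;              /* 500 kHz, conservative for the slave */
    ioctl(fd, SPI_IOC_WR_MODE, &mode);
    ioctl(fd, SPI_IOC_WR_MAX_SPEED_HZ, &speed);

    uint8_t tx[3] = { 0x01, 0x00, 0x00 }; /* hypothetical "read sensor" command */
    uint8_t rx[3] = { 0 };

    /* Full-duplex transfer: send the command, receive the reply bytes */
    struct spi_ioc_transfer tr;
    memset(&tr, 0, sizeof(tr));
    tr.tx_buf = (unsigned long)tx;
    tr.rx_buf = (unsigned long)rx;
    tr.len = sizeof(tx);
    tr.speed_hz = speed;
    tr.bits_per_word = 8;

    if (ioctl(fd, SPI_IOC_MESSAGE(1), &tr) < 0) {
        perror("SPI_IOC_MESSAGE");
        close(fd);
        return 1;
    }

    printf("slave replied: %02x %02x %02x\n", rx[0], rx[1], rx[2]);
    close(fd);
    return 0;
}

On the Pi, the spidev device only shows up after enabling the SPI controller (for example via the device tree overlay in /boot/config.txt).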

Powering the Raspberry PI A+ with 4xAA batteries

This is a migrated version of my WordPress post, written on 8 March 2015

I used the Pi as the brain of an autonomous toy car project, and for such a project, autonomy is always a key factor to consider. So I did some research to find an efficient way to power the Pi from batteries. In my project, I used 4xAA batteries as the power source (because they are very common and easy to find).

Power saving

To save power, my suggestion is to use the Pi A+, the least power-hungry model in its family. In headless mode (no HDMI, no camera, idling at the command line), it draws around 100 mA - 120 mA, with or without a USB wireless adapter attached for network communication.

The first thing you need to do is turn off the HDMI output, which can save about 20 mA. Note that without HDMI you can only access the Pi over the network using SSH (that is, you need to configure the network and SSH before turning HDMI off).

tvservice --off

Programming the LPC1114FN28 using Raspberry Pi

This is a migrated version of my WordPress post, written on 27 February 2015

I had some Raspberry Pi A+ boards available in my toolbox, and I got the idea to use one of them in my programmable toy car project. The point is that the Pi is connected to a circuit based on the LPC1114FN28, in which the ARM Cortex-M0 chip is used to collect sensor data (IR sensors, sonar sensors, etc.) and control the motors on the car. The Pi talks to the LPC1114FN28 via a serial connection (UART or SPI) and takes care of some high-level calculations based on the data provided by the LPC chip; it can then analyse the environment's context and send commands to the slave chip to control the car. With the Pi, I can build an API to program the car's behaviour remotely over the network (using TCP or HTTP, via the web). It's quite an interesting subject for me.

So the first thing that came to my mind is that, during the experiments, I will need to update the firmware on the LPC1114FN28 frequently, so why not use the Pi as a programmer for the LPC chip? The firmware is written and compiled on your PC and then flashed onto the slave chip by sending it to the Pi; no need to use a USB-serial adapter anymore. In this post, I'll show you how to do it; this is a part of my actual project.
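As an illustration of the idea, here is a minimal sketch of the very first step of NXP's UART ISP handshake, run from the Pi. It assumes the LPC1114 has already been reset with its ISP entry pin held low and that its UART is wired to /dev/ttyAMA0; the device path and the lack of error handling are simplifications, and a full programmer such as lpc21isp implements the complete protocol.

/* Minimal sketch of the first step of the UART ISP handshake.
 * Opens the Pi's UART at 115200 8N1, sends the autobaud character '?'
 * and prints the bootloader's reply (expected: "Synchronized").
 */
#include <termios.h>
#include <fcntl.h>
#include <unistd.h>
#include <string.h>
#include <stdio.h>

int main(void)
{
    int fd = open("/dev/ttyAMA0", O_RDWR | O_NOCTTY);
    if (fd < 0) {
        perror("open /dev/ttyAMA0");
        return 1;
    }

    /* Raw 115200 8N1, 1 s read timeout */
    struct termios tio;
    memset(&tio, 0, sizeof(tio));
    tio.c_cflag = B115200 | CS8 | CLOCAL | CREAD;
    tio.c_cc[VMIN] = 0;
    tio.c_cc[VTIME] = 10;
    tcflush(fd, TCIOFLUSH);
    tcsetattr(fd, TCSANOW, &tio);

    write(fd, "?", 1);                        /* ISP autobaud request */

    char buf[32] = { 0 };
    ssize_t n = read(fd, buf, sizeof(buf) - 1);
    if (n > 0)
        printf("bootloader says: %s\n", buf); /* expect "Synchronized" */

    close(fd);
    return 0;
}

If the wiring and the reset sequence are right, the bootloader answers "Synchronized" and the rest of the ISP command protocol (baud confirmation, unlock, write to RAM, copy to flash) can follow from there.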
