DISPLAY – Simple LandTiger NXP LPC1768 video player

The aim of this article is to briefly explain how to play a short video (as a sequence of frames) using the LandTiger LPC1768 board and the connected ILI9325 LCD screen (ILI9320 on some boards).

First of all we need to find a way to convert an input video into a compatible format. The most direct way to do that – and the one we will take – is to extract a sequence of frames from it and then draw those extracted frames on the screen, respecting specific delay constraints.

Extracting frames

For this purpose the choice fell on ffmpeg – an extremely powerful tool that converts and edits audio and video streams.

After installing it, let’s extract a resized portion of the input video

ffmpeg -i INPUT_VIDEO -vf "scale=-2:HEIGHT,crop=WIDTH:HEIGHT" -ss START_TIME -t DURATION "VIDEO_NAME_cut.mp4"

where the non-obvious parameters are WIDTH and HEIGHT, since:

  • in the scale part you can omit one of the two (by replacing it with -1 or -2 as appropriate)
  • in the crop part they must be consistent with the scaled-down size specified above

Now let’s actually extract the frames

ffmpeg -i "VIDEO_NAME_cut.mp4" -vf fps=1/FRAME_OFFSET -vcodec rawvideo -pix_fmt rgb565be -f image2 ./$filename%03d.raw

I will explain in detail some of the arguments of the previous command:

  • fps=1/FRAME_OFFSET -> “take a frame from source every FRAME_OFFSET seconds”
  • rawvideo -> no header inserted, only pixel representation
  • -pix_fmt rgb565be -> the flow of data from the board to the LCD is – in theory – 16-bit based. What we are asking here is “I want every pixel to be represented in the RGB 16-bit convention – Big Endian”
  • output frames will be saved with the .raw extension and with increasing numbers in the filename

Image conversion

Now that we have a bunch of .raw frames in our folder, the next step is to convert them into a C-like structure that can be imported into our project.

As shown in a previous Special Project (thanks to Gianni Cito), the free and open-source image editor GIMP comes to our rescue, since it offers “Source Code” as an output format for our input image. Unfortunately – unless we master GIMP’s Script-Fu scripting language – this is a time-consuming operation and does not allow us to convert multiple images all at once.

We already have, instead, a partial conversion, i.e. a flow of bytes that can be converted into a C array with the following commands:

find . -name "*.raw" | sort | xargs -r -I{} cat "{}" > video_merged
xxd -i video_merged > array.h

The former merges all the frames, in numerical order, into one big file – which somehow represents our video – while the latter dumps the input file and finally generates the .h array!
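
For reference, the array.h generated by xxd -i looks roughly like this (the byte values and the length below are only placeholders):

unsigned char video_merged[] = {
  0x1f, 0xa4, 0x07, 0xe0, 0xf8, 0x00, /* ... one byte per element ... */
};
unsigned int video_merged_len = 125000;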

Ok.. and now?

After this long conversion phase, what we have is an unsigned char[] array of 16-bit pixel values, and we want to make it a const one.

Why const?

Because the LandTiger’s LPC1768 has 512 kB of flash memory – a relatively large amount – and declaring the array const lets it stay in that read-only flash instead of being copied into the much smaller RAM.
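
In practice this just means editing the declaration generated by xxd so that the linker keeps the data in flash; a minimal sketch:

/* array.h - adding const lets the array live in flash instead of RAM */
const unsigned char video_merged[] = {
  /* ... pixel data generated by xxd -i ... */
};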

How does it work?

  1. Import the converted .h array
  2. Create a video object from it
  3. Set the appropriate RIT interval value
  4. Enable RIT

You can find the code, the implementation details and an explanation of the import operations on the GitLab repository.

More about timing

In point 3) I said that we must find a proper value for the RIT initialization. Strict timing must be respected in order to obtain an acceptable video playback.

First of all, we need to wait a certain number of milliseconds between two consecutive frames, given by the input video’s fps.
For example, a 25 fps video leads to a 40 ms delay between frames.
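
As a minimal sketch of the timer setup (CMSIS register names for the LPC1768 RIT; the peripheral-clock value and the helper name are my own assumptions, adapt them to your project):

#include "LPC17xx.h"

#define RIT_PCLK_HZ 25000000UL                 /* assumption: PCLK feeding the RIT */

void RIT_start_ms(uint32_t ms) {               /* e.g. RIT_start_ms(40) for a 25 fps video */
  LPC_SC->PCONP      |= (1u << 16);            /* power up the RIT                         */
  LPC_RIT->RICOMPVAL  = (RIT_PCLK_HZ / 1000u) * ms;
  LPC_RIT->RICOUNTER  = 0;                     /* restart counting from zero               */
  LPC_RIT->RICTRL     = (1u << 1) | (1u << 3); /* clear counter on compare + enable timer  */
  NVIC_EnableIRQ(RIT_IRQn);                    /* RIT_IRQHandler draws the next frame      */
}                                              /* (remember to clear the RITINT flag there) */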

Then, we need a fast way of transferring the frame’s byte stream to the LCD. The GLCD library comes with an LCD_SetPoint function, which accepts the X-Y coordinates and the 16-bit color value as parameters, but it is too slow for video playback purposes! In fact, what it does is perform three writes to the LCD: two for the index selection (to the LCD’s registers R20h and R21h) and one for the output RGB value (R22h).

Please note that each write actually hides a double write to the LCD (one to the IR index register and one with the actual value). Furthermore, although the ILI932x LCD supports a 16-bit interface, the way it is connected to this board only allows an 8-bit transfer (only 8 out of 16 pins are connected).

Clearly, we need to dig into the LCD manual in order to find a faster (and more constant, in terms of delays) way of transferring pixels.

Luckily, the ILI932x LCD offers a Window Address Area mode, which allows the user to select a subset of the screen. By writing to the R50–53h registers we set the window’s corner coordinates. Finally, we can send all the frame’s pixels by performing a single index register write (R22h: data transfer) followed by a stream of pixel transfers.

Note: with the basic LCD_SetPoint we would have needed at least 5 writes per pixel (not taking into account the 16-bit interface problem).
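
A minimal sketch of how the frame transfer looks with the window mode (the LCD_WriteReg / LCD_WriteIndex / LCD_WriteData helpers are assumed from the GLCD driver; their names may differ in your copy of the library):

#include <stdint.h>

void LCD_DrawFrame(uint16_t x0, uint16_t y0, uint16_t w, uint16_t h,
                   const uint16_t *pixels)
{
  LCD_WriteReg(0x50, x0);           /* horizontal window start (R50h)         */
  LCD_WriteReg(0x51, x0 + w - 1);   /* horizontal window end   (R51h)         */
  LCD_WriteReg(0x52, y0);           /* vertical window start   (R52h)         */
  LCD_WriteReg(0x53, y0 + h - 1);   /* vertical window end     (R53h)         */
  LCD_WriteReg(0x20, x0);           /* GRAM cursor back to the window origin  */
  LCD_WriteReg(0x21, y0);
  LCD_WriteIndex(0x22);             /* single index write: GRAM data (R22h)   */
  for (uint32_t i = 0; i < (uint32_t)w * h; i++)
    LCD_WriteData(pixels[i]);       /* then stream every pixel of the frame   */
}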

Showcase

In conclusion

A few other words to summarize our results and outline some open problems.

We’ve managed to display a short sequence of frames so as to give the feeling of a video. The conversion sequence is automated, so we can start it and then import everything into our project.

The commands are meant to be launched on a Unix/Linux distribution, although there are tools that make them available on Windows too.

Achieved goals

  • Video playback
  • Convert input video
  • Possibility of zooming in on the image (with some loss of quality)

Limitations

  • Unfortunately, the 512 kB flash memory is only large enough to fit a small amount of data (basically a sequence of a few dozen 50×50 px frames)
  • Long board flashing times
  • Complex inclusion operations: we need to include the headers and re-compile every time

📽️ We have overcome these limitations with our improved Audio Visual player, in which the SD card is used to load any pre-converted video and a – somewhat complex and tailored – synchronization between audio and video allows correct playback (at a slightly higher resolution)!


Federico Bitondo, s276294

Prof. Paolo Bernardi

PoC – Emulation of LCD and TP for the LandTiger board

Hi, my name’s Gabriele Filipponi. I’m writing this write-up to demonstrate a proof of concept I made a while ago to emulate the LCD and Touch Panel.

As you know, Keil offers a lot of debug utilities to monitor the core’s on-board peripherals, but it certainly lacks the ones we got used to during the course ‘Architettura dei Sistemi di Elaborazione’, like, for example, the LCD and TP.

The emulation was achieved with a Python script and a library made by Keil’s developers, named ‘UVSC.dll’, which offers a network-based interface to control Keil.

For the demo I wrote a simple project that displays a red background with an ‘LCD Emulation’ label in the center and two buttons at the bottom:

  • “Red”
  • “Black”

Depending on which one you tap, the screen turns red or black.

Here is the demo:

Demo of the Emulator

TOUCH SCREEN – Creating a Graphics Interface handling Images, Buttons and Drawing functionalities on LandTiger

Gianni Cito: s261725@studenti.polito.it

Prof. Paolo Bernardi: paolo.bernardi@polito.it

The goal of this project is to design an application that exploits some of the graphics features available on the LandTiger. Basically, we are going to see how to create a GUI on our LCD screen, placing and labeling images, text and buttons at specific positions on the screen. Once these initial theoretical points have been covered, we will move to the creation of a drawing scene where we can draw freehand lines or text, change colors, and draw lines, circles and rectangles.

Let’s start by describing the points we are going to cover to reach the final goal.

Calibration

The first window that appears in our application is the ‘Calibration’ one. This part of the application is extremely important: it calibrates the Touch Panel by asking the user to touch some targets plotted on the screen.

Images

Let’s see how to show images on the LCD screen, as in the above picture.

To import any kind of image inside the application, first of all we need to convert it into a .c file. To do that we simply use GIMP, an open-source graphics editor. So, let’s see step by step how to convert an image in any format into a .c image file.

  1. Import the image in GIMP.

  2. Then, File > Export As and, in the dialog that appears, select Source Code, which exports to .c files.

  3. In the new dialog, enter a prefixed name, check “Use macros instead of struct” and “Save as RGB565 (16-bit)”, and then hit “Export”.

  4. Now a .c file is created and looks like this:

Since the data is static, we can include the .c file in the location where we want to use the image (#include “image.c”) and then point the expected BITMAP to the static variable, in our case LOGO_pixel_data.
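
A minimal usage sketch (LCD_DrawBitmap and the width/height macros are placeholders, not the project’s actual names):

#include "image.c"   /* exported by GIMP: defines the static LOGO_pixel_data array */

void show_logo(void) {
  /* hypothetical helper that copies an RGB565 bitmap to the screen at (10, 20) */
  LCD_DrawBitmap(10, 20, LOGO_WIDTH, LOGO_HEIGHT, (const unsigned short *)LOGO_pixel_data);
}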

An example of an image inserted in the project application.

Buttons

In order to create buttons, I place an image at the position I want and then store the coordinates of the image edges in a linked list. This way, once a tap on the Touch Panel is caught, a dedicated function determines whether a button has been clicked by simply checking if the tap coordinates are within the boundaries of the image.

In the above image, we can see the image boundary coordinates (xN, yN) stored in the linked list, needed to detect whether the click, identified by the coordinates (x, y), falls inside or outside the image.
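
The idea can be sketched like this (the structure and function names are illustrative, not the project’s actual code):

#include <stddef.h>
#include <stdint.h>

typedef struct Button {
  uint16_t x0, y0, x1, y1;       /* boundary coordinates of the button image */
  void (*on_click)(void);        /* action to run when the button is hit     */
  struct Button *next;           /* next node of the linked list             */
} Button;

static Button *button_list = NULL;

/* called with the (x, y) coordinates returned by the Touch Panel driver */
void handle_touch(uint16_t x, uint16_t y) {
  for (Button *b = button_list; b != NULL; b = b->next)
    if (x >= b->x0 && x <= b->x1 && y >= b->y0 && y <= b->y1) {
      b->on_click();             /* the tap falls inside this button */
      return;
    }
}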

Drawing Scene

The drawing scene is the most important scene of this project. Many of the graphics functionalities are used and described right in this scene. We will see how to:

  • draw freehand with a brush;
  • use a rubber to erase content from the screen;
  • draw predefined shapes, like circles, rectangles or lines;
  • change color of the next points that we are going to draw on the screen.

To use one of the functionalities listed above, we just need to press one of the buttons depicted in the top bar. A button is selected when a border appears around it.

Brush

With the Brush we can simply draw freehand on the screen, as depicted in the following screenshot.

Rubber

With the Rubber, instead, we can erase content already drawn on the screen.

Inside the Drawing Scene, which we will see in the next chapter, I also added some dropdown menus to show further options belonging to the same category; for example, Circles, Rectangles and Lines all belong to the Shapes category. The same happens for colors.

Pressing for a few seconds on a button with a small arrow at the bottom, a menu appears, sliding from the top to the bottom of the screen and giving the user the possibility to choose one of the other options provided.

An important functionality that I implemented in the dropdown menu is that, if the pixels that will be covered by the dropdown menu are not empty, their content is stored before showing the menu and restored when it disappears.

Lines

To draw a perfect line, you just need to press on the screen twice: the first time for the first point and the second time for the second one. A line will appear joining the two chosen points.

To draw lines, I chose Bresenham’s algorithm and adapted it to this project.
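
For reference, a compact form of the algorithm (this sketch assumes the GLCD LCD_SetPoint function; it is not the project’s exact code):

#include <stdint.h>
#include <stdlib.h>

void draw_line(int x0, int y0, int x1, int y1, uint16_t color) {
  int dx = abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
  int dy = -abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
  int err = dx + dy;                 /* Bresenham error term */
  for (;;) {
    LCD_SetPoint(x0, y0, color);     /* plot the current pixel */
    if (x0 == x1 && y0 == y1) break;
    int e2 = 2 * err;
    if (e2 >= dy) { err += dy; x0 += sx; }   /* step in x */
    if (e2 <= dx) { err += dx; y0 += sy; }   /* step in y */
  }
}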

Circles

To draw circles, as for lines, you need to press on the screen twice, choosing two points. This time, the first point will be the center, while the second one will define the radius from the center.
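
Continuing the sketch above, a midpoint (Bresenham) circle routine can be used here, with the radius computed as the distance between the two taps (again assuming LCD_SetPoint; the project’s actual implementation may differ):

void draw_circle(int xc, int yc, int r, uint16_t color) {
  int x = 0, y = r, d = 3 - 2 * r;            /* midpoint decision variable */
  while (x <= y) {
    /* plot the eight symmetric points of the current step */
    LCD_SetPoint(xc + x, yc + y, color); LCD_SetPoint(xc - x, yc + y, color);
    LCD_SetPoint(xc + x, yc - y, color); LCD_SetPoint(xc - x, yc - y, color);
    LCD_SetPoint(xc + y, yc + x, color); LCD_SetPoint(xc - y, yc + x, color);
    LCD_SetPoint(xc + y, yc - x, color); LCD_SetPoint(xc - y, yc - x, color);
    if (d < 0) d += 4 * x + 6;
    else { d += 4 * (x - y) + 10; y--; }
    x++;
  }
}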

Rectangles

To draw rectangles, as for the previous shapes, you need to press on the screen twice. The first point detected will be the top-left corner, while the second one will be the bottom-right corner. With these two points, drawing a rectangle is very easy: the vertical edges go from the top to the bottom coordinate, and the horizontal edges from the left to the right coordinate, as in the sketch below.
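
Reusing the draw_line sketch shown earlier, the rectangle boils down to four calls (illustrative only):

void draw_rect(int left, int top, int right, int bottom, uint16_t color) {
  draw_line(left,  top,    right, top,    color);   /* top edge    */
  draw_line(left,  bottom, right, bottom, color);   /* bottom edge */
  draw_line(left,  top,    left,  bottom, color);   /* left edge   */
  draw_line(right, top,    right, bottom, color);   /* right edge  */
}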

Colors

The colors menu allows the user to choose which color to use for drawing.

As we can see from the above screenshot, the dropdown menu shows 6 colors. These are not represented by 6 different drop images filled with different colors: we have just a single image reused for all the colors.

To do this, I used a movie technique, namely the green screen technique. Using a drop image filled with red (any color is fine as long as it is different from white and black, because the background is white and the borders of the drop are black), we can replace that color with the one we want. I created a specific function to perform this operation: when it encounters the color to replace, it changes it to the desired one.
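
A sketch of that replacement function (the key color, the helper name and the parameters are illustrative; 0xF800 is pure red in RGB565):

#define KEY_COLOR 0xF800   /* the "green screen" color used in the drop image */

void draw_drop(uint16_t x0, uint16_t y0, uint16_t w, uint16_t h,
               const uint16_t *bitmap, uint16_t new_color) {
  for (uint16_t y = 0; y < h; y++)
    for (uint16_t x = 0; x < w; x++) {
      uint16_t c = bitmap[y * w + x];
      if (c == KEY_COLOR) c = new_color;   /* replace the key color on the fly */
      LCD_SetPoint(x0 + x, y0 + y, c);
    }
}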

USB – remote control of mouse pointer and text of a PC from LandTiger

For my project I chose Mbed, an open-source operating system made by Arm. It’s a good operating system for IoT and embedded systems in general. Apart from a very active community, it has a wide variety of supported boards that act like plug-and-play devices and are very convenient to program.

Of course the LandTiger is not one of the supported boards.

Fortunately, Mbed supports a board called “mbed LPC1768” which, as you can probably guess, has the same processor as the LandTiger.

I will now guide you through the various steps needed to create a program, compile it and flash it onto the board.

As we will see, flashing the program is a bit more complex than “usual”, but once you get used to it, the entire procedure only takes a couple of minutes.

What you need:

  • A LandTiger board.
  • One or two serial cables (if you want to use the board as a USB device, like a mouse or a keyboard).
  • An ULINK2 (or a cheaper USB-to-JTAG programmer) to flash your code into the board.
  • Keil uVision 4 (the free edition is perfectly fine).
  • A good amount of patience.

Let’s see how to set everything up.

HowTo

Simple Mouse:

Take me home, country roads:

Gotta catch’em all:

WiFi Keyboard:

Touch Keyboard and Trackpad:

LEDS – Modulation of a Landtiger LED by using the Potentiometer

LED modulation through the Pulse Width Modulation peripheral, with adjustable duty cycle and frequency of the PWM signal, on a LandTiger V2.0 NXP LPC1768 Development Board.

Polytechnic of Turin, A.Y. 2019/2020.
Prof. Paolo Bernardi paolo.bernardi@polito.it
Francesco Angione s262620@studenti.polito.it

Demo of the bare metal project

The goal of this project is to understand the behaviour of a PWM peripheral and appreciate it by driving a LED with its output signal.

The starting point has been the following figure:

From which the following formulas can be derived:

  • fpwm = fclk / ((divisor + 1) · (max + 1))
  • duty cycle = reg / (max + 1)

    Note: divisor and max are taken as the raw register values, which is why a one is added to each of them in these formulas.

Those formulas are used in the software in order to change the duty cycle and the frequency accordingly.
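
A minimal sketch of how the two formulas map onto the LPC1768 PWM1 registers (CMSIS names; the chosen channel is an assumption and the pin configuration via PINSEL is omitted):

#include "LPC17xx.h"

void pwm_config(uint32_t divisor, uint32_t max, uint32_t reg) {
  LPC_SC->PCONP |= (1u << 6);             /* power up PWM1                            */
  LPC_PWM1->PR   = divisor;               /* fpwm = PCLK / ((divisor+1) * (max+1))    */
  LPC_PWM1->MR0  = max;                   /* period: the counter resets on MR0 match  */
  LPC_PWM1->MR1  = reg;                   /* duty cycle = reg / (max+1) on channel 1  */
  LPC_PWM1->MCR  = (1u << 1);             /* reset the counter on MR0 match           */
  LPC_PWM1->LER  = (1u << 0) | (1u << 1); /* latch the new MR0/MR1 values             */
  LPC_PWM1->PCR  = (1u << 9);             /* enable the PWM1.1 output                 */
  LPC_PWM1->TCR  = (1u << 0) | (1u << 3); /* counter enable + PWM mode enable         */
}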

In order to achieve that, every fifty milliseconds the ADC is triggered and the joystick value is sampled. As a consequence, the ADC interrupt handler obtains the digital value of the voltage on the potentiometer and uses it as the new duty cycle. Meanwhile, the joystick is in charge of increasing or decreasing the frequency of the PWM output signal. This continuous acquisition can be stopped by pressing KEY1, freezing the acquisition and locking the duty cycle and the signal frequency to the current values.
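
A sketch of the ADC side of this (simplified: the channel selection, the interrupt enabling and the exact scaling are assumptions):

void ADC_IRQHandler(void) {
  uint32_t val  = (LPC_ADC->ADGDR >> 4) & 0xFFF;      /* 12-bit conversion result   */
  uint32_t duty = (val * (LPC_PWM1->MR0 + 1)) >> 12;  /* scale it to the PWM period */
  LPC_PWM1->MR1 = duty;                               /* new duty-cycle match value */
  LPC_PWM1->LER |= (1u << 1);                         /* latched at the next period */
}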

To make the application more user-friendly, a simple Graphical User Interface has been added, in order to give instantaneous feedback on the LCD about all the acquired values and the operations that can be performed.

As a last step, the LED modulation has been migrated from a simple bare-metal project to one powered by an RTOS (Real Time Operating System); in particular, Micrium µC/OS-III has been used.
In order to achieve that, the operating system has been “ported” to the LandTiger evaluation board.
After that, the previous application has been split into three main tasks: the GUI task (with the highest priority) and two other tasks for the ADC trigger and the joystick sampling.
The ADC and joystick tasks have the possibility to wake up the GUI task in order to update the values on the LCD screen every time they detect a variation.
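
As a rough sketch of the task split under µC/OS-III (priorities, stack sizes and the task body are illustrative, not the project’s actual values):

#include "os.h"

static OS_TCB  gui_tcb;
static CPU_STK gui_stack[256];

static void gui_task(void *p_arg) {
  OS_ERR err;
  (void)p_arg;
  for (;;) {
    /* sleep until the ADC or joystick task signals a variation:
       they call OSTaskSemPost(&gui_tcb, OS_OPT_POST_NONE, &err); */
    OSTaskSemPend(0, OS_OPT_PEND_BLOCKING, NULL, &err);
    /* ...redraw the duty cycle and frequency values on the LCD... */
  }
}

void create_gui_task(void) {
  OS_ERR err;
  OSTaskCreate(&gui_tcb, "GUI", gui_task, NULL,
               2,                                   /* highest application priority */
               gui_stack, 25, 256,                  /* stack base, limit, size      */
               0, 0, NULL,
               OS_OPT_TASK_STK_CHK | OS_OPT_TASK_STK_CLR, &err);
}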

Reference:

– LPC1768 manual UM10360 Chapter 24: Pulse Width Modulation

– LPC1768 manual UM10360 Chapter 9: General Purpose Input/Output

– LPC1768 manual UM10360 Chapter 29: Analog to Digital Converter

– LPC1768 manual UM10360 Chapter 22: Repetitive Interrupt Timer

– µC/OS-III The Real Time Kernel, Jean J. Labrosse, Freddy Torres

Link to GitLab Repository (bare metal)

Link to GitLab Repository (LED Modulation with Micrium µC/OS-III)