OpenCV 4.0 is now available for download


OpenCV 4.0 has been officially announced.

What’s new?

  • OpenCV is now a C++11 library and requires a C++11-compliant compiler.
  • Most of the C API has been removed.
  • The dnn (deep neural network) module includes an experimental Vulkan backend for platforms where OpenCL is not available.
  • A QR code detector and decoder have been added to the objdetect module.

You can see all the changes in the OpenCV change log.

Hackaday Belgrade 2018 is this weekend, the 26th of May. Get ready for retro computing BASIC badge hacking.


This weekend, on the 26th of May, Hackaday will hold their second conference in Belgrade.

You can see the program here.

The conference badge is a cool retro computer running BASIC. There will be a badge hacking workshop, so we will bring some PICkit 3s with us.

I will give a talk about how we hacked a soldering robot that had an ugly programming interface, using a TERES-I laptop, an FPGA, and Sigrok, and how we replaced the soldering robot's brain with an OLinuXino-MICRO, so now it is ready to take CAD CNC files, use fiducials, and do AOI inspection after soldering.


Amazing quadcopter video shows how compact and powerful our computers can be today


An interesting TED video shows how advanced the technology we have in our hands is now.

Computers get smaller and lighter but more and more powerful, so we can attach them even to flying quadcopters and have them process video and control multiple motors.

A13-OLinuXino-WIFI and OpenCV Face detection project


We continue our experiments with OpenCV; this new project detects a human face, mouth, eyes, and upper head position.

When a human face is captured by the camera, its coordinates are detected; then, with a push button, you can attach a mustache, glasses, or horns to the detected face.

The project sources are on GitHub: https://github.com/OLIMEX/OLINUXINO/tree/master/SOFTWARE/A13/A13-OLinuXino%2BOpenCV-Face

Here is a video: http://www.youtube.com/watch?v=pLHQZte-FuY

A13-OLinuXino-WIFI running SCRATCH + OPEN-CV demonstration


Now that we know how to interface new blocks in the Scratch IDE with Python, and have the OpenCV library up and running on OLinuXino, it was only a matter of time before we made a new project.

There is already a nice project written in Scratch with eyes that track your mouse pointer: http://scratch.mit.edu/projects/Doody/269924

We have also shown you OpenCV object tracking: http://www.youtube.com/watch?v=CigGvt3DXIw

Now we decided to combine both in a single project:

1. Install scratchpy

# git clone git://github.com/pilliq/scratchpy.git
# cd scratchpy

    !!!IMPORTANT: Make sure that your date is set:
# date -s "3 APR 2013"
# python setup.py install

2. Install OpenCV

# apt-get install libopencv-dev python-opencv

3. Install Scratch

# apt-get install scratch

4. Open Scratch and open the demo project Eyes.sb

5. A message should appear saying that remote sensor connections are enabled.

If it doesn’t, click on “Sensing”, right-click on a sensor value, and click “Enable remote sensor connections”.

6. Open a terminal and start the tracking program

# python track.py

7. This will track a yellow ball, and the eyes will look in its direction

8. If you want to use some other object with a different color, use config.py.

Start the config:

# python config.py
Move the sliders until only the desired color is white and everything else is black.

Open track.py and edit the following line:
cv.InRangeS(imgHSV,
 cv.Scalar(0, 100, 255),
 cv.Scalar(40, 255, 256),
 imgThreshed)

Write your values, for example:

cv.InRangeS(imgHSV,
 cv.Scalar(110, 98, 100),
 cv.Scalar(131, 255, 256),
 imgThreshed)

Then just start the file:

# python track.py

If you wonder what line 77, k = cv.WaitKey(70), does: it is a delay. OpenCV tracks the object very fast and sends coordinates faster than Scratchpy + Scratch can handle, which leads to a buffer overflow. With this delay the object tracking is artificially slowed down so that Scratch has time to update the animation properly.
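The same rate-limiting idea can be sketched in plain Python (the callback and interval below are illustrative, not the project's code):

```python
import time

def send_throttled(points, send, min_interval=0.07):
    # deliver coordinates no faster than one per min_interval seconds,
    # so a slow consumer (Scratch, in our case) is not flooded
    for p in points:
        send(p)
        time.sleep(min_interval)

received = []
send_throttled([(10, 20), (30, 40)], received.append, min_interval=0.01)
print(received)  # [(10, 20), (30, 40)]
```

Here min_interval plays the role of the 70 ms argument to cv.WaitKey.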

Here is video of this project: http://www.youtube.com/watch?v=Pyidx-zOsm4

Note that the camera is placed on top of the monitor and shows a mirrored image, i.e. when you move the heart to the right the animation moves the eyes to the right, but on camera you see the mirrored image 🙂

And the GitHub sources are HERE

Next OpenCV project – Face tracking is in progress, stay tuned 🙂

Make Door Security Logger with A13-OLinuXino-WIFI + OpenCV


This is a cool little project, done in minutes with an A13-OLinuXino running OpenCV. We were thinking about what to make with OpenCV and the GPIOs on the A13-OLinuXino, and decided to put a small switch on our laboratory door, connected to an A13-OLinuXino GPIO:


then to wire the A13-OLinuXino and a web cam on the old ping-pong table in front of the door, so we can sense every time the door is opened and closed:


OK, now we are ready; we just have to write the Python code to log pictures from the web cam every time somebody enters the lab:

from cv2 import *
import time
import datetime
import A13_GPIO as gpio

def main():
    # init the GPIO module and configure the door-switch pin as input
    gpio.init()
    gpio.setcfg(gpio.PIN36, gpio.INP)

    # select /dev/video0 as source
    cam = VideoCapture(0)

    while True:
        # wait for low level (door open)
        while True:
            g = gpio.input(gpio.PIN36)
            if g == 0:
                break
            time.sleep(0.1)

        # give the person time to step into the frame
        time.sleep(2)

        # take 15 pictures and use only the last one,
        # so the camera has time to adjust its exposure
        for i in range(15):
            s, img = cam.read()

        # name the snapshot after the current system time
        now = datetime.datetime.now()
        k = str(now)
        if s:
            imwrite(k + ".jpg", img)
            print(k + " -> New image saved...")

        # wait for high level (door closed)
        while True:
            g = gpio.input(gpio.PIN36)
            if g == 1:
                break
            time.sleep(0.1)

        # wait some time (debounce)
        time.sleep(1)

if __name__ == '__main__':
    main()

You can download the project code and OpenCV installation instructions on GitHub: https://github.com/OLIMEX/OLINUXINO/tree/master/SOFTWARE/A13/A13%2BOpenCV%2BDoor-Security

Scratch + A13-OLinuXino-WIFI Physical Computing


Scratch is a great visual environment, very easy for children to understand.

I have already blogged that Scratch runs fine on OLinuXino here: https://olimex.wordpress.com/2013/03/12/a13-olinuxino-wifi-running-scratch-ide-perfect-platform-for-kids-programming-education/

What bothered us, though, was that we couldn't use all the GPIOs and hardware resources which OLinuXino offers, as a basic Scratch installation can only play music and make cool animations.

So we started searching for information on how to implement this.

Soon we found SNAP http://snap.berkeley.edu/, which is based on BYOB (Build Your Own Blocks), which in turn is based on Scratch but adds the ability to make your own blocks, recursion, and other new features.

SNAP is implemented in JavaScript so, even better, it runs in your browser without any installation. Based on SNAP there are a few cool projects made by Technoboy10 (https://github.com/Technoboy10) interfacing a Wii Nunchuk, Arduino, and NXT Lego robots.

I asked Technoboy10 whether there is documentation on how to connect hardware to SNAP, and soon he published very detailed info in the wiki: https://github.com/Technoboy10/snap-server/wiki/How-to-create-a-Snap!-Extension

So how cool is this? Kids are able to program robots with a visual IDE!

Looking at Technoboy10's GitHub repository we figured out how to do this, but the basic SNAP installation does not have such a great choice of sprites and animation examples, so I decided, for a start, to try to implement the OLinuXino hardware features in Scratch.

So I posted a question on the Scratch forum and started waiting for a reply: http://scratch.mit.edu/forums/viewtopic.php?id=115821 . After a few days of waiting, 30+ views, and no replies, I did what I should have done at the beginning: I searched GitHub for “scratch c”, then for “scratch python”, and found what we needed:
https://github.com/pilliq/scratchpy.

This is a cool Python client for Scratch. When you enable remote sensors in Scratch, Scratch creates a server; then with scratchpy you can poll given variables and read or change their values. This way you can connect sensors or actuators.

As opposed to SNAP (where SNAP is the client and snap-server is written in Python, so SNAP connects to the server), here Scratch creates the server and the scratchpy client connects to it and interacts with it through variables.

So here is our first “Hello World”: a blinking LED controlled from Scratch: http://www.youtube.com/watch?v=qbTNWTa5tXQ

Instructions on how to install the OLinuXino GPIO interface for Scratch are on GitHub: https://github.com/OLIMEX/OLINUXINO/tree/master/SOFTWARE/A13/A13-GPIO%2BSCRATCH

Our second project animates the GPIO connector in Scratch based on real button inputs and LED outputs:


Here is a video of the GPIO interaction with Scratch: http://www.youtube.com/watch?v=DzmvqlQodac

Now that we have this powerful tool, scratchpy, to interface Scratch with anything via Python, just imagine what would happen if we connected Scratch to … OpenCV 🙂

In my previous blog posts I showed you some videos which demonstrate the power of OpenCV: face recognition, facial expression recognition, colored object tracking.  You could, for instance, make a Scratch script which recognizes the face of the person in front of the web cam and runs an animation to welcome them by name.

Or you could program your robot to chase a yellow ping-pong ball placed in front of it.

Or you could make different animations depending on the facial expression of the person in front of the camera… the applications and the fun will be endless!
