AirGesture project

This article covers a project submitted by a client who wanted to detect gestures above the device camera, without touching the screen.

State of the art

Several proprietary and open-source libraries address this problem.

Chosen solution

We chose to use the HandWave library with OpenCV.

Currently, the last stable version of OpenCV compatible with HandWave is 2.4.12, which is not compiled for 64-bit architectures.

Conversely, the HandWave library is not compatible with OpenCV 3.1.0.

We found the GitHub library NewHandWave, which has already done all the work of upgrading and recompiling for 32-bit platforms: source

What is OpenCV?

OpenCV (Open Source Computer Vision) is a library developed in C/C++ under a BSD license, and is therefore free for both academic and commercial use.

It has C++, C, Python and Java interfaces, and supports Windows, Linux, Android, iOS and macOS.

This library offers a large set of algorithms to process and analyse videos and pictures from a camera input.
For example, it can help you set up a face detection or pattern recognition application.

OpenCV was initially developed by Intel, and is now maintained and extended by over 40,000 active members on its GitHub repository.

Here is a quick presentation of the different modules available through the C API :

  • core: Main functionalities.
    This module is used to manipulate basic structures, perform matrix operations, draw on pictures, and save and load data in XML files…
  • imgproc: Image processing.
    Image manipulation, geometric transformations, filters, and outline and shape detection…
  • features2d: Descriptors.
    Descriptor extraction through two different approaches (SURF and StarDetector)
  • objdetect: Object detection.
    Used for object recognition (for example faces, eyes, cars…) via the AdaBoost algorithm (Viola & Jones, 2001)
  • video: Video analysis.
    Includes motion estimation, background subtraction, and object tracking algorithms for a video stream
  • highgui: UI.
    Video capturing, image and video codecs, as well as simple UI capabilities. This also includes functions to easily build a GUI for our applications
  • calib3d: Single and stereo camera calibration.
    This module allows reconstructing 3D elements from multiple camera inputs using basic multiple-view geometry algorithms.

HandWave gesture recognition

The HandWave recognition library does not actually recognize hand shapes: it works purely on changes in light intensity.

To detect motion, it first converts the camera's input frames to greyscale using OpenCV, then calculates the mean light intensity of each frame and stores it in a buffer for later comparison.

Once the buffer reaches 100 frames, used for calibration, it detects intensity changes in new frames: the entry point of an object is determined by a drop in luminosity, and the exit point when the luminosity rises back to its normal level. The light-intensity buffer is then recalculated after each recognized gesture.
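The intensity-buffer idea can be sketched in plain Java. This is a simplified illustration, not HandWave's actual implementation; the class name and the drop threshold are made up for the example:

```java
import java.util.ArrayDeque;
import java.util.Deque;

/** Simplified sketch of HandWave's intensity-buffer idea (not the real implementation). */
public class IntensityGestureDetector {
    private static final int CALIBRATION_FRAMES = 100; // frames used to establish the baseline
    private static final double DROP_THRESHOLD = 0.8;  // entry when intensity falls below 80% of baseline (made-up value)

    private final Deque<Double> buffer = new ArrayDeque<>();
    private boolean objectPresent = false;

    /** Feed the mean grey-scale intensity of one frame; returns "ENTRY", "EXIT" or null. */
    public String onFrame(double meanIntensity) {
        if (buffer.size() < CALIBRATION_FRAMES) {
            buffer.addLast(meanIntensity);      // still calibrating: just record the frame
            return null;
        }
        double baseline = buffer.stream().mapToDouble(Double::doubleValue).average().orElse(0);
        if (!objectPresent && meanIntensity < baseline * DROP_THRESHOLD) {
            objectPresent = true;               // luminosity dropped: something entered the frame
            return "ENTRY";
        }
        if (objectPresent && meanIntensity >= baseline * DROP_THRESHOLD) {
            objectPresent = false;              // luminosity back to normal: the object left
            buffer.clear();                     // re-calibrate after each gesture
            return "EXIT";
        }
        return null;
    }
}
```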

Once we have the entry and exit points of the gesture, HandWave compares the abscissa and ordinate differences between those two points to determine the direction of the motion.

            double diffY = mEndPos.y - mStartPos.y;
            double diffX = mEndPos.x - mStartPos.x;

            // The dominant axis of the motion determines the gesture direction.
            if (Math.abs(diffX) > Math.abs(diffY)) {
                // Horizontal gesture: require a minimum displacement and gesture length.
                if (Math.abs(diffX) > mMinDirectionalMotionX && Math.abs(gestureLength) > mMinGestureLength) {
                    // Note: positive diffX maps to Left, presumably because the preview is mirrored.
                    movementDirection = (diffX > 0) ? Direction.Left : Direction.Right;
                }
            } else {
                // Vertical gesture.
                if (Math.abs(diffY) > mMinDirectionalMotionY && Math.abs(gestureLength) > mMinGestureLength) {
                    movementDirection = (diffY > 0) ? Direction.Down : Direction.Up;
                }
            }
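The same mapping can also be extracted into a pure helper that is easy to unit-test. The class name, enum and the single threshold parameter below are ours, not HandWave's; the Left/Right inversion for positive diffX follows the library code above:

```java
/** Standalone version of the direction mapping above (names and thresholds are ours). */
public class GestureDirection {
    public enum Direction { Left, Right, Up, Down, None }

    /**
     * Classify a gesture from its start and end coordinates.
     * Positive diffX maps to Left, matching the (mirrored) library behaviour.
     */
    public static Direction classify(double startX, double startY,
                                     double endX, double endY,
                                     double minMotion) {
        double diffX = endX - startX;
        double diffY = endY - startY;
        if (Math.abs(diffX) > Math.abs(diffY)) {
            if (Math.abs(diffX) > minMotion) {
                return diffX > 0 ? Direction.Left : Direction.Right;
            }
        } else if (Math.abs(diffY) > minMotion) {
            return diffY > 0 ? Direction.Down : Direction.Up;
        }
        return Direction.None; // motion too small to classify
    }
}
```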

Integration

To be fully functional, both libraries, “OpenCV310” and “HandWave”, must be imported as modules, and the app needs a library dependency on them.
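With Android Studio, this typically means declaring the modules in settings.gradle and adding project dependencies in the app's build.gradle. A sketch, assuming module names matching the article (your paths and Gradle version may differ; older Gradle versions use `compile` instead of `implementation`):

```groovy
// settings.gradle
include ':app', ':openCV310', ':handwave'

// app/build.gradle
dependencies {
    implementation project(':openCV310')
    implementation project(':handwave')
}
```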

Initialization

The activity, or fragment, in which you want to recognize air gestures must contain the JavaCameraView widget.

The recognition area depends on the size of the widget.

Since cameras have large resolutions and the library did not want to be hardware-specific, we presume this is directly linked to the screen dpi and the distance calculation.

<org.opencv.android.JavaCameraView
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:background="@android:color/transparent"
        android:id="@+id/camera_preview"
        />

In your activity, when the scene is created, we need to load the OpenCV and HandWave libraries:

protected void loadOpenCV() {
    if (!OpenCVLoader.initDebug()) {
        // No OpenCV bundled with the app: fall back to the OpenCV Manager app.
        Log.d(TAG, "Internal OpenCV library not found. Using OpenCV Manager for initialization");
        OpenCVLoader.initAsync(OpenCVLoader.OPENCV_VERSION_3_1_0, this, mLoaderCallback);
    } else {
        Log.d(TAG, "OpenCV library found inside package. Using it!");
        mLoaderCallback.onManagerConnected(LoaderCallbackInterface.SUCCESS);
    }
}

This piece of code may lead the application to display a popup inviting the user to install the “OpenCV Manager” application, which contains all the compiled OpenCV libraries (for 64-bit, only version 3.1.0).

When using the OpenCV Manager, the following callback is necessary:


/** OpenCV library initialization. */
private BaseLoaderCallback mLoaderCallback = new BaseLoaderCallback(this) {
    @Override
    public void onManagerConnected(int status) {
        switch (status) {
            case LoaderCallbackInterface.SUCCESS:
                mOpenCVInitiated = true;
                CameraGestureSensor.loadLibrary();  // load HandWave's native code
                mGestureSensor.start();             // your main gesture sensor object
                break;
            default:
                super.onManagerConnected(status);
                break;
        }
    }
};

Execution

Once the JavaCameraView is implemented and initialized, the activity (or fragment) must implement CameraGestureSensor.Listener:

    @Override
    public void onGestureUp(CameraGestureSensor caller, final long gestureLength) {
        Log.i(TAG, "Up");
        // Handle the gesture
    }

    @Override
    public void onGestureDown(CameraGestureSensor caller, final long gestureLength) {
        Log.i(TAG, "Down");
        // Handle the gesture
    }

    @Override
    public void onGestureLeft(CameraGestureSensor caller, long gestureLength) {
        Log.i(TAG, "Left");
        // Handle the gesture
    }

    @Override
    public void onGestureRight(CameraGestureSensor caller, long gestureLength) {
        Log.i(TAG, "Right");
        // Handle the gesture
    }

When pausing and resuming the activity, do not forget to set up and tear down camera recognition:

    @Override
    public void onPause() {
        super.onPause();
        stopCamera();
    }

    @Override
    public void onResume() {
        super.onResume();
        setupCamera();
    }

    public void stopCamera() {
        if (mGestureSensor != null) {
            mGestureSensor.stop();
            mGestureSensor.removeGestureListener(MainActivity.this);
            mGestureSensor = null;
        }
    }

    @Override
    public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
        if (ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA) != PackageManager.PERMISSION_GRANTED
                && grantResults[0] != PackageManager.PERMISSION_GRANTED) {
            Toast.makeText(this, "The camera permission is not granted", Toast.LENGTH_LONG).show();
        } else {
            setupCamera();
        }
    }

You can find the complete sample application in the AirGesture repository.

What about iOS?

Since OpenCV is not embedded in an official framework with Swift support, it must be manually embedded and recompiled.
There are a few open-source repositories available, such as
swift 3 openCV or Tesseract Swift (specialized in OCR).

There is also a little tutorial on inserting a C++ static framework into a Swift project:

How to use cpp with swift
