
My goal is to find a golf ball using the iPhone camera. I worked out the following steps in Photoshop, and I want to achieve the same steps in OpenCV on an image or live video frame, starting with the original picture:

  • then boost the saturation to get color into light areas
  • then use curves to cut off the edges of the spectrum
  • then convert the image to grayscale
  • use curves again to get to black/white
  • and finally - just for the look - apply a color
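For reference, the five steps above can be sketched in plain NumPy. This is only an illustration of the underlying math, not the Core Image or OpenCV calls used below; the saturation factor, curve points, threshold, and tint are made-up values:

```python
import numpy as np

def tone_curve(values, xs, ys):
    # Piecewise-linear stand-in for Photoshop's Curves tool.
    return np.interp(values, xs, ys)

def process(rgb):
    """rgb: float array in [0, 1], shape (H, W, 3)."""
    # step 1: boost saturation by pushing channels away from the pixel mean
    mean = rgb.mean(axis=2, keepdims=True)
    saturated = np.clip(mean + (rgb - mean) * 2.0, 0.0, 1.0)

    # step 2: curves that cut off the edges of the spectrum
    clipped = tone_curve(saturated, [0.0, 0.2, 0.8, 1.0], [0.0, 0.0, 1.0, 1.0])

    # step 3: convert to grayscale (Rec. 601 luminance weights)
    gray = clipped @ np.array([0.299, 0.587, 0.114])

    # step 4: curves again -- here a hard threshold to pure black/white
    bw = (gray > 0.5).astype(float)

    # step 5: apply a color -- white areas take the tint, black stays black
    tint = np.array([1.0, 0.2, 0.2])  # arbitrary tint, just for the look
    return bw[..., None] * tint

demo = np.zeros((2, 2, 3))
demo[0, 0] = [0.9, 0.9, 0.9]  # one bright pixel, rest black
out = process(demo)
```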

--Input Image: (image omitted)

--Output Image: (image omitted)

Could you please help me, or give me some hints on image processing with OpenCV on iOS?

Thanks in advance!

Edit: I used the following code and got the output image below.

    - (UIImage *)applyToneCurveToImage:(UIImage *)image
    {
        CIImage *ciImage = [[CIImage alloc] initWithImage:image];

        // tone curve: remap luminance through five control points
        CIFilter *filter =
            [CIFilter filterWithName:@"CIToneCurve"
                       keysAndValues:
                           kCIInputImageKey, ciImage,
                           @"inputPoint0", [CIVector vectorWithX:0.00 Y:0.3],
                           @"inputPoint1", [CIVector vectorWithX:0.25 Y:0.4],
                           @"inputPoint2", [CIVector vectorWithX:0.50 Y:0.5],
                           @"inputPoint3", [CIVector vectorWithX:0.75 Y:0.6],
                           @"inputPoint4", [CIVector vectorWithX:1.00 Y:0.7],
                           nil];

        // step 1: chain CIColorControls onto the tone-curve output
        filter = [CIFilter filterWithName:@"CIColorControls"
                            keysAndValues:kCIInputImageKey,
                                          [filter valueForKey:kCIOutputImageKey],
                                          nil];
        [filter setValue:@0.0f forKey:@"inputBrightness"];
        [filter setValue:@6.0f forKey:@"inputContrast"];

        CIImage *result = [filter valueForKey:kCIOutputImageKey];
        CIContext *context = [CIContext contextWithOptions:nil];
        CGImageRef cgImage = [context createCGImage:result
                                           fromRect:[result extent]];
        UIImage *filteredImage = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);

        return filteredImage;
    }
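As an aside, the control points in the CIToneCurve call above do the opposite of step 2: approximated piecewise-linearly, they map black to 0.3 and white to 0.7, compressing the range rather than cutting off the edges of the spectrum (CIToneCurve actually fits a smooth curve through the points, but the endpoints map exactly):

```python
import numpy as np

# the five inputPoint values from the CIToneCurve call above
xs = [0.00, 0.25, 0.50, 0.75, 1.00]
ys = [0.30, 0.40, 0.50, 0.60, 0.70]

# black (0.0) comes out at 0.3, mid-gray stays at 0.5, white (1.0) at 0.7
curve = np.interp([0.0, 0.5, 1.0], xs, ys)
```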

    - (void)didCaptureIplImage:(IplImage *)iplImage
    {
        @autoreleasepool
        {
            // note: no cvCreateImage here -- the earlier version allocated an
            // image and then immediately overwrote the pointer, leaking it
            IplImage *orgimage =
                [self CreateIplImageFromUIImage:
                          [self applyToneCurveToImage:[UIImage imageNamed:@"GolfImage.jpeg"]]];

            // convert to HSV; hue is used to find a certain color
            IplImage *imgHSV = cvCreateImage(cvGetSize(orgimage), 8, 3);
            cvCvtColor(orgimage, imgHSV, CV_BGR2HSV);

            // single-channel mask of pixels inside the HSV range
            IplImage *imgThreshed = cvCreateImage(cvGetSize(orgimage), 8, 1);
            // cvInRangeS(imgHSV, cvScalar(_Hmin, _Smin, _Vmin), cvScalar(_Hmax, _Smax, _Vmax), imgThreshed);
            cvInRangeS(imgHSV, cvScalar(0.00, 0.00, 34.82), cvScalar(180.00, 202.54, 256.00), imgThreshed);

            cvReleaseImage(&iplImage);
            cvReleaseImage(&orgimage);
            cvReleaseImage(&imgHSV);

            [self didFinishProcessingImage:imgThreshed];
        }
    }
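For reference, cvInRangeS just builds a per-pixel mask: a pixel is 255 when every channel lies inside the given bounds, and 0 otherwise. A NumPy equivalent, with made-up sample values, looks like this:

```python
import numpy as np

def in_range(img, lower, upper):
    # Pixel -> 255 when every channel is within [lower, upper] inclusive,
    # mirroring what OpenCV's cvInRangeS / cv::inRange computes.
    lower = np.asarray(lower)
    upper = np.asarray(upper)
    mask = np.all((img >= lower) & (img <= upper), axis=-1)
    return mask.astype(np.uint8) * 255

hsv = np.array([[[90, 100, 100],     # inside the range -> 255
                 [90, 250, 100]]])   # saturation above the bound -> 0
mask = in_range(hsv, (0, 0, 35), (180, 203, 255))
```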

Output Image: (image omitted)

  • Please show some code. (Jan 6, 2014 at 8:30)
  • I don't understand the question. What do you want to achieve? Your picture looks like a threshold (black substituted with red). (Jan 6, 2014 at 8:31)
  • I don't understand the edit. Why have you removed all of the detail from your question? (Jan 7, 2014 at 15:39)

1 Answer


You don't need OpenCV for any of this; you should be able to get this result using Core Image.

See this question: How to change minimum or maximum value by CIFilter in Core image?, where I give a fairly detailed answer on the manipulation of tone curves.

This will cover your steps 2 and 4. For step 1 (saturation), try CIColorControls. For step 3 (convert to grayscale) you could also use CIColorControls, but that would involve dropping saturation to 0, which is not what you want. Instead you can use CIMaximumComponent or CIMinimumComponent. For step 5, you could use the result of steps 1-4 as a mask with a flat colour.
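To illustrate why CIMaximumComponent suits step 3: it takes the per-pixel maximum of R, G and B, so a saturated colour stays bright instead of dropping toward mid-gray as it would under a luminance conversion. A NumPy sketch of that idea (not the Core Image API itself):

```python
import numpy as np

def max_component(rgb):
    # per-pixel max of R, G, B -- the CIMaximumComponent idea
    return rgb.max(axis=-1)

pixels = np.array([[[1.0, 0.0, 0.0],     # pure red: stays at 1.0
                    [0.2, 0.2, 0.2]]])   # dark gray: stays at 0.2
gray = max_component(pixels)
```

A luminance conversion would map the pure red pixel to roughly 0.3, losing the brightness that the later threshold step depends on.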

OpenCV will allow you to pick out the ball from the rest of the image (I guess this is what you want to achieve; you didn't mention it in your question). You can refer to this question: How does the HoughCircle function works? I can't get it to work properly, which I answered with an accompanying demo project: https://github.com/foundry/OpenCVCircles. You can pass the result of your Core Image processing into OpenCV by converting from UIImage to the OpenCV Mat format (that linked project shows you how to do this).


2 Comments

  • Thanks @foundry for your reply. Please check the output in the edit section; I have added the code you suggested.
  • +1 Thanks for your reply and for providing the links. I will look into them.
