enableHardwareDecode doesn't function correctly #9

Open
SpencerKaiser opened this issue Sep 26, 2018 · 13 comments

@SpencerKaiser

SpencerKaiser commented Sep 26, 2018

After setting enableHardwareDecode to true/YES (Swift/Objc), cv_pixelbuffer_fastupload always has a value of nil/null.

Edit: Although this bug is still open for addressing enableHardwareDecode, it’s important to note that part of the issue is a lack of documentation. If you are seeing a value of nil for that property, make sure you add DJIVideoPreviewer.instance()?.registFrameProcessor(self) to your code (see the comments below for full context).

@SpencerKaiser SpencerKaiser changed the title enableHardwareDecode doesn't function enableHardwareDecode doesn't function correctly Sep 26, 2018
@dji-dev
Contributor

dji-dev commented Sep 28, 2018

Hi @SpencerKaiser, we have just updated the iOS SDK GitHub sample to demonstrate how to use VideoFrameProcessor. Please check if it helps.

dji-sdk/Mobile-SDK-iOS@8137760

@matiasfr

matiasfr commented Dec 5, 2018

@dji-dev @SpencerKaiser Were you able to get cv_pixelbuffer_fastupload working in Swift? The pointer isn't null for me, but it crashes when I try to cast it to a CVPixelBuffer. I tried commenting on this thread but I don't have enough reputation yet:

https://stackoverflow.com/questions/52392672/using-dji-video-feed-with-vision-framework

@SpencerKaiser
Author

SpencerKaiser commented Dec 5, 2018

@matiasfr Yet another reason why GitHub is a better place for this type of convo than Stack Overflow 😉 It’s been a while, but I believe the missing piece was adding this line where you register your app:
DJIVideoPreviewer.instance()?.registFrameProcessor(self)

After adding that, cv_pixelbuffer_fastupload should have a value, and then you can port the code they provided to Swift and use it to build a CVPixelBuffer:

VideoFrameYUV *yuvFrame; // the VideoFrameProcessor output
CVPixelBufferRef pixelBuffer = NULL;
CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault,
                                      yuvFrame->width,
                                      yuvFrame->height,
                                      kCVPixelFormatType_420YpCbCr8Planar,
                                      NULL,
                                      &pixelBuffer);
if (result != kCVReturnSuccess || pixelBuffer == NULL) {
    return;
}
if (CVPixelBufferLockBaseAddress(pixelBuffer, 0) != kCVReturnSuccess) {
    return;
}
long yPlaneWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
long yPlaneHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
long uPlaneWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 1);
long uPlaneHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 1);
long vPlaneWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 2);
long vPlaneHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 2);
uint8_t *yDestination = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
memcpy(yDestination, yuvFrame->luma, yPlaneWidth * yPlaneHeight);
uint8_t *uDestination = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
memcpy(uDestination, yuvFrame->chromaB, uPlaneWidth * uPlaneHeight);
uint8_t *vDestination = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 2);
memcpy(vDestination, yuvFrame->chromaR, vPlaneWidth * vPlaneHeight);
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

My Swift port can be found here!

Let me know if that helps or if there is anything else you run into. Good luck!
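One thing worth noting about the copy above: for kCVPixelFormatType_420YpCbCr8Planar, the two chroma planes are half the luma width and height, so the byte counts per plane differ. (The memcpys also assume CVPixelBufferGetBytesPerRowOfPlane equals the plane width, i.e. no row padding; if the buffer pads rows, you'd need a row-by-row copy instead.) A standalone sketch of the 4:2:0 plane arithmetic:

```c
#include <stddef.h>

/* Plane byte counts for 4:2:0 planar video (kCVPixelFormatType_420YpCbCr8Planar):
   plane 0 is full-resolution luma; planes 1 and 2 (Cb, Cr) are each half the
   width and half the height of the luma plane. */
size_t planeSize(size_t width, size_t height, int plane) {
    if (plane == 0) {
        return width * height;          /* Y */
    }
    return (width / 2) * (height / 2);  /* Cb or Cr */
}
```

For a 1280x720 frame this gives 921600 bytes of luma and 230400 bytes for each chroma plane, which is what the three memcpy calls above end up copying.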

@matiasfr

matiasfr commented Dec 5, 2018

@SpencerKaiser Hey Spencer, definitely agree on GitHub being a better discussion medium. Thanks for posting the sample! I got my code working with software decode using your example. I had to turn off hardware decode to ensure that the luma and chroma values were being populated. I also found the following comment in the framework header:

When fastupload is enabled, luma, chromaB and chromaR may be nil and the pixel information can be accessed in cv_pixelbuffer_fastupload.

However, I'd still like to get my app working with hardware decoding, which means I would want to cast the cv_pixelbuffer_fastupload directly to a CVPixelBuffer rather than copy the values from the frame struct. Let me know if you have any ideas!

@dji-dev is there a way to cast the cv_pixelbuffer_fastupload to a CVPixelBuffer in Swift? I've already tried let pixelBuffer: CVPixelBuffer = frame.pointee.cv_pixelbuffer_fastupload as! CVPixelBuffer to no avail.

@SpencerKaiser
Author

Ahh, gotcha... yeah, I definitely didn’t get that working, just the workaround above. Best of luck!

@matiasfr

matiasfr commented Feb 20, 2019

It's been a couple months but I had a breakthrough! You can get the pixel buffer with hardware decoding on in Swift by doing:
let pixelBuffer = unsafeBitCast(frame.pointee.cv_pixelbuffer_fastupload, to: CVPixelBuffer.self)

@rreichel3

@matiasfr What did you do to enable hardware decoding? I am running into a similar issue: I can set the VideoPreviewer's enableHardwareDecode property to true, but frame.pointee.cv_pixelbuffer_fastupload is still returning null.

@matiasfr

@rreichel3 Hmm, make sure you are not running in the simulator (hardware decode only works on a device). Did you call DJIVideoPreviewer.instance().start()? Is your videoProcessFrame method getting called?

@rreichel3

@matiasfr Yep, I've been testing on my iPhone 8 Plus and my videoProcessFrame is getting called. I've attached both my videoProcessFrame and initialization code below. I'm also using the latest versions of both the DJI SDK (4.9.1) and DJIWidget (1.3). Do you think it could be due to the fact that I'm using an iPhone 8 Plus?

videoProcessFrame:

func videoProcessFrame(_ frame: UnsafeMutablePointer<VideoFrameYUV>!) {
    if frame.pointee.cv_pixelbuffer_fastupload != nil {
        let cvBuf = unsafeBitCast(frame.pointee.cv_pixelbuffer_fastupload, to: CVPixelBuffer.self)
        let imageRequestHandler = VNImageRequestHandler(cvPixelBuffer: cvBuf, options: [:])
        do {
            try imageRequestHandler.perform(self.requests)
        } catch {
            print(error)
        }
    } else {
        let pixelBuffer = createPixelBuffer(fromFrame: frame.pointee)
        let imageRequestHandler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer!, orientation: .up, options: [:])
        do {
            try imageRequestHandler.perform(self.requests)
        } catch {
            // print(error)
        }
    }
}

Initialization:

DJIVideoPreviewer.instance().enableHardwareDecode = true
DJIVideoPreviewer.instance().registFrameProcessor(self)
DJIVideoPreviewer.instance().setView(self.fpvView)

let product = DJISDKManager.product()
// Use "secondaryVideoFeed" if the DJI product is an A3, N3, Matrice 600, or Matrice 600 Pro; otherwise, use "primaryVideoFeed".
if (product?.model == DJIAircraftModelNameA3)
    || (product?.model == DJIAircraftModelNameN3)
    || (product?.model == DJIAircraftModelNameMatrice600)
    || (product?.model == DJIAircraftModelNameMatrice600Pro) {
    DJISDKManager.videoFeeder()?.secondaryVideoFeed.add(self, with: nil)
} else {
    DJISDKManager.videoFeeder()?.primaryVideoFeed.add(self, with: nil)
}
DJIVideoPreviewer.instance().start()
print("Hardware decoding setting after start: ")
print(DJIVideoPreviewer.instance().enableHardwareDecode)

@matiasfr

That looks like it should work to me; odd that it's not. I don't think the iPhone model should make a difference; if you can run the DJI GO app, then this should work too. ¯\_(ツ)_/¯

@jasin755

jasin755 commented Aug 3, 2020

I have the same problem. I tested it on a physical device (not the simulator) and cv_pixelbuffer_fastupload always returns nil. I have tried to fall back to software decoding, but I can't create a CIImage from the VideoFrameYUV.

Code:

CVPixelBufferRef pixelBuffer = NULL;
CVReturn result = CVPixelBufferCreate(kCFAllocatorDefault,
                                      frame->width,
                                      frame->height,
                                      kCVPixelFormatType_420YpCbCr8Planar,
                                      NULL,
                                      &pixelBuffer);
if (result != kCVReturnSuccess || pixelBuffer == NULL) {
    return;
}
if (CVPixelBufferLockBaseAddress(pixelBuffer, 0) != kCVReturnSuccess) {
    return;
}

long yPlaneWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
long yPlaneHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);
long uPlaneWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 1);
long uPlaneHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 1);
long vPlaneWidth = CVPixelBufferGetWidthOfPlane(pixelBuffer, 2);
long vPlaneHeight = CVPixelBufferGetHeightOfPlane(pixelBuffer, 2);
uint8_t *yDestination = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
memcpy(yDestination, frame->luma, yPlaneWidth * yPlaneHeight);
uint8_t *uDestination = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 1);
memcpy(uDestination, frame->chromaB, uPlaneWidth * uPlaneHeight);
uint8_t *vDestination = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 2);
memcpy(vDestination, frame->chromaR, vPlaneWidth * vPlaneHeight);
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

[self log:[NSString stringWithFormat:@"Frame width: %@", @(frame->width).stringValue]];

if(pixelBuffer == NULL) {
    [self log:@"PixelBuffer is NULL"];
    return;
}

if(CVPixelBufferGetPlaneCount(pixelBuffer) == 3) {
    [self log:@"y-cr-cb"];
}

if(CVPixelBufferGetPlaneCount(pixelBuffer) == 2) {
    [self log:@"y-cr-cb-bi"];
}

CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer];

if(ciImage != NULL) {
    [self log:@"CIImage exist"];
} else {
    [self log:@"CIImage doesnt exist"];
}

Log:

Frame width: 1280
y-cr-cb
CIImage doesnt exist
Frame width: 1280
y-cr-cb
CIImage doesnt exist
....

How can I create CIImage from CVPixelBufferRef properly?
Thanks.
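For what it's worth, one possible explanation (an assumption on my part, not something confirmed in this thread): CoreImage generally accepts biplanar 4:2:0 formats such as kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange, but may return nil for the tri-planar kCVPixelFormatType_420YpCbCr8Planar used above. If that's the cause, one option would be to create the buffer as biplanar and interleave chromaB/chromaR into its single chroma plane. The interleave itself is plain C:

```c
#include <stddef.h>
#include <stdint.h>

/* Interleave separate Cb and Cr planes (as in VideoFrameYUV's chromaB/chromaR)
   into a single CbCrCbCr... plane, which is the layout biplanar (NV12-style)
   pixel buffers expect. chromaWidth/chromaHeight are half the luma dimensions. */
void interleaveChroma(const uint8_t *cb, const uint8_t *cr, uint8_t *cbcr,
                      size_t chromaWidth, size_t chromaHeight) {
    for (size_t i = 0; i < chromaWidth * chromaHeight; i++) {
        cbcr[2 * i]     = cb[i];   /* even bytes: Cb */
        cbcr[2 * i + 1] = cr[i];   /* odd bytes: Cr  */
    }
}
```

In the code above this would replace the two chroma memcpys, writing into plane 1 of a biplanar buffer (row padding via CVPixelBufferGetBytesPerRowOfPlane would still need handling).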

@afullerriis

If your decoded image is distorted like this:

(screenshot)

This happens when the aircraft camera's aspect ratio is set to 4:3 when it should be 16:9. Changing it to 16:9 via the setAspectRatio function corrected my issue.

@aero-jet

aero-jet commented Feb 9, 2022

Hi @dji-dev

After debugging the code, I found that it is reaching this path in loadPrebuildIframe() in DJIVideoHelper.m for me. It returns 0.

(screenshot)

The caller of this is DJIH264VTDecode, which consequently sets _hardwareUnavailable to YES and then falls back to using software decoding.

(screenshot)

The reason why loadPrebuildIframe() is returning 0 seems to be because g_loadPrebuildIframePathFunc is initialised to nil and never written to.

(screenshot)

Is this a bug?

EDIT
After digging some more, I found that the default encoder type is H264EncoderType_DM368_inspire. This causes the decoder to try to load a prebuilt iframe. I now set the encoder type to H264EncoderType_unknown, which bypasses loading the prebuilt iframe, and it seems to be working now.

(screenshot)

I am using a Mavic Pro. Does this mean we have to manually map the encoder type to the product connected to the SDK?
