enableHardwareDecode doesn't function correctly #9
Comments
Hi @SpencerKaiser, we just updated the iOS SDK GitHub sample to demonstrate how to use `VideoFrameProcessor`; please check if it helps.
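
For context, a minimal sketch of what conforming to `VideoFrameProcessor` looks like in Swift. The method names follow DJIWidget's header, but treat the exact signatures as an assumption and check `VideoFrameProcessor.h` in your copy of the framework:

```swift
import DJIWidget

class FrameReceiver: NSObject, VideoFrameProcessor {
    // Return true so the previewer forwards decoded frames to this processor.
    func videoProcessorEnabled() -> Bool {
        return true
    }

    // Called once per decoded frame with a pointer to the YUV frame struct.
    func videoProcessFrame(_ frame: UnsafeMutablePointer<VideoFrameYUV>!) {
        guard let frame = frame else { return }
        // Inspect frame.pointee here: luma/chroma planes with software decode,
        // cv_pixelbuffer_fastupload with hardware decode.
        _ = frame.pointee.width
    }
}
```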
@dji-dev @SpencerKaiser Were you able to get `cv_pixelbuffer_fastupload` working in Swift? The pointer isn't null for me, but it crashes when I try to cast it as a `CVPixelBuffer`. I tried commenting on the Stack Overflow thread, but I don't have enough reputation yet.
@matiasfr yet another reason why GitHub is a better place for this type of convo than Stack Overflow 😉 It's been a while, but I believe the missing piece was adding the line where you register your app as a frame processor (see the sketch below); after adding that, frames started coming through for me. My Swift port can be found here! Let me know if that helps or if there is anything else you run into. Good luck!
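
A minimal sketch of that registration step; only the `registFrameProcessor` call is the piece being referred to, and the surrounding view-controller placement is illustrative:

```swift
import UIKit
import DJIWidget

class CameraViewController: UIViewController, VideoFrameProcessor {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Without this registration, videoProcessFrame is never invoked,
        // which is why cv_pixelbuffer_fastupload appears to stay nil.
        DJIVideoPreviewer.instance()?.registFrameProcessor(self)
    }

    func videoProcessorEnabled() -> Bool { return true }
    func videoProcessFrame(_ frame: UnsafeMutablePointer<VideoFrameYUV>!) { /* handle frame */ }
}
```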
@SpencerKaiser Hey Spencer, definitely agree on GitHub being a better discussion medium. Thanks for posting the sample! I got my code working with software decode using your example. I had to turn off hardware decode to ensure that the luma and chroma values were being populated. I also found the following comment in the framework header:
However, I'd still like to get my app working with hardware decoding, which means I'd want to cast `cv_pixelbuffer_fastupload` directly to a `CVPixelBuffer` rather than copying the values from the frame struct. Let me know if you have any ideas! @dji-dev Is there a way to cast `cv_pixelbuffer_fastupload` to a `CVPixelBuffer` in Swift? I've already tried
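
As an illustration of what "copying the values from the frame struct" looks like on the software-decode path, a sketch that reads the planar data out of `VideoFrameYUV`. The field names (`luma`, `chromaB`, `chromaR`, `width`, `height`) follow DJIWidget's struct, and the quarter-size chroma planes assume 4:2:0 subsampling:

```swift
import Foundation
import DJIWidget

// With enableHardwareDecode = false the planar pointers are populated
// (cv_pixelbuffer_fastupload is not used on this path).
func copyPlanes(from frame: UnsafeMutablePointer<VideoFrameYUV>) -> (y: Data, cb: Data, cr: Data)? {
    let width = Int(frame.pointee.width)
    let height = Int(frame.pointee.height)
    guard width > 0, height > 0,
          let luma = frame.pointee.luma,
          let chromaB = frame.pointee.chromaB,
          let chromaR = frame.pointee.chromaR else { return nil }

    // 4:2:0: each chroma plane is half the width and half the height of the luma plane.
    let y  = Data(bytes: luma,    count: width * height)
    let cb = Data(bytes: chromaB, count: width * height / 4)
    let cr = Data(bytes: chromaR, count: width * height / 4)
    return (y, cb, cr)
}
```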
Ahh gotcha... yeah, I definitely didn't get that working, just the workaround above. Best of luck!
It's been a couple of months, but I had a breakthrough! You can get the pixel buffer with hardware decoding on in Swift by doing:
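
A sketch of one way to do this, assuming `cv_pixelbuffer_fastupload` comes across to Swift as a raw pointer backed by a `CVPixelBuffer`; the `unsafeBitCast` and the helper name are illustrative rather than the exact code from the comment:

```swift
import CoreVideo
import DJIWidget

func pixelBuffer(from frame: UnsafeMutablePointer<VideoFrameYUV>) -> CVPixelBuffer? {
    // With hardware decode enabled, the decoder hands back a CVPixelBuffer
    // through this untyped pointer; reinterpret it instead of dereferencing it.
    guard let raw = frame.pointee.cv_pixelbuffer_fastupload else { return nil }
    return unsafeBitCast(raw, to: CVPixelBuffer.self)
}
```

`Unmanaged<CVPixelBuffer>.fromOpaque(raw).takeUnretainedValue()` is an equivalent reinterpretation if you prefer to avoid `unsafeBitCast`.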
@matiasfr What did you do to enable hardware decoding? I am running into a similar issue where I can set the VideoPreviewer's `enableHardwareDecode` property to true, however the reference (`cv_pixelbuffer_fastupload`) still comes back nil.
@rreichel3 Hmm, make sure you are not running in the simulator (hardware decode only works on a physical device). Did you do
@matiasfr Yep, I've been testing on my iPhone 8 Plus and my videoProcessFrame is getting called. I've attached both my videoProcessFrame and initialization code below. Further, I'm using the latest versions of both the DJI SDK (4.9.1) and DJIWidget (1.3). Do you think it could be due to the fact that I'm using an iPhone 8 Plus?

videoProcessFrame:

Initialization:
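
For comparison, a sketch of an initialization sequence along the lines being discussed; the outlet name and placement are illustrative (not taken from the comment above), and wiring the raw video feed data into the previewer is omitted:

```swift
import UIKit
import DJIWidget

final class PreviewViewController: UIViewController, VideoFrameProcessor {
    @IBOutlet weak var previewView: UIView!   // hypothetical view that shows the live feed

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let previewer = DJIVideoPreviewer.instance() else { return }

        previewer.enableHardwareDecode = true   // request hardware decoding before frames arrive
        previewer.setView(previewView)          // render the decoded feed into our view
        previewer.registFrameProcessor(self)    // required, or videoProcessFrame never fires
        previewer.start()
    }

    func videoProcessorEnabled() -> Bool { return true }

    func videoProcessFrame(_ frame: UnsafeMutablePointer<VideoFrameYUV>!) {
        // Check whether hardware decode actually produced a pixel buffer this frame.
        let hasFastUpload = (frame?.pointee.cv_pixelbuffer_fastupload != nil)
        print("cv_pixelbuffer_fastupload available: \(hasFastUpload)")
    }
}
```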
That looks like it should function to me; odd that it's not working. I think the iPhone model shouldn't make a difference; if you can run the DJI GO app, then this should work too. ¯\_(ツ)_/¯
I have the same problem. I tested it on a physical device (not the simulator).

Code:

Log:

How can I create
Hi @dji-dev, after debugging the code, I found that it is reaching this path in

The caller of this is

The reason why

Is this a bug?

EDIT: I am using a Mavic Pro. Does this mean we have to manually map the encoder type to the product connected to the SDK?
After setting `enableHardwareDecode` to `true`/`YES` (Swift/Obj-C), `cv_pixelbuffer_fastupload` always has a value of `nil`/`null`.

Edit: Although this bug is still open for addressing `enableHardwareDecode`, it's important to note that part of the issue is a lack of documentation. If you are seeing a value of `nil` for that value, make sure you add `DJIVideoPreviewer.instance()?.registFrameProcessor(self)` to your code (see the comments in this thread for full context).