Speedup in Java code #4

rrahmati opened this issue Mar 19, 2019 · 13 comments
@rrahmati

Instead of the for loop used to create the bitmap in the Java code, use this:

```java
IntBuffer intBuf = ByteBuffer.wrap(frame).order(ByteOrder.BIG_ENDIAN).asIntBuffer();
int[] pixels = new int[intBuf.remaining()];
intBuf.get(pixels);
bmp = Bitmap.createBitmap(pixels, width, height, Bitmap.Config.ARGB_8888);
ivPreview.setImageBitmap(bmp);
```

It makes the code an order of magnitude faster!
Also, in the C++ code, change `AV_PIX_FMT_RGB24` to `AV_PIX_FMT_ARGB`.
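
For context, here is a minimal sketch of the kind of per-pixel loop the bulk copy replaces (illustrative only - the names `frame`, `width`, `height`, `bmp`, and `ivPreview` follow the snippet above, and the library's actual loop may differ). Each `setPixel()` call crosses into native code, which is roughly why the bulk copy is so much faster:

```java
// Hypothetical slow path: build the bitmap one pixel at a time.
Bitmap bmp = Bitmap.createBitmap(width, height, Bitmap.Config.ARGB_8888);
for (int y = 0; y < height; y++) {
    for (int x = 0; x < width; x++) {
        int i = 4 * (y * width + x);
        int color = ((frame[i] & 0xFF) << 24)      // A
                  | ((frame[i + 1] & 0xFF) << 16)  // R
                  | ((frame[i + 2] & 0xFF) << 8)   // G
                  |  (frame[i + 3] & 0xFF);        // B
        bmp.setPixel(x, y, color);                 // one JNI call per pixel
    }
}
ivPreview.setImageBitmap(bmp);
```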

@alsaleem00

Thanks. The original code barely achieved 4 FPS in the emulator.

@dylanatkeylogic

Ancient bump... but:

Swapping in this code gives me an `ArrayIndexOutOfBoundsException` on this line:

```java
bmp = Bitmap.createBitmap(pixels, width, height, Bitmap.Config.ARGB_8888);
```

It seems like the pixel array isn't large enough to play this 1920x1080 video. Any thoughts?

@alsaleem00

@dylanatkeylogic
You have to modify the C++ code to match the pixel format and width/height.
I used it with a 4M image and had no problem (it consumes about 42 MB from the OS).
Another thing: you will get an OOM if you do not control the rate at which you send frames to Java.
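
One way to surface a format/size mismatch early on the Java side - a sketch using the names from the first comment, not the library's actual code - is to check the frame length against the expected 4 × width × height before wrapping it:

```java
// Sketch: verify the native side really delivers 4 bytes per pixel (ARGB)
// before handing the data to createBitmap(); otherwise drop the frame.
int expected = 4 * width * height;
if (frame.length != expected) {
    android.util.Log.w("RtspClient", "Frame size mismatch: got " + frame.length
            + " bytes, expected " + expected + " (check the AV_PIX_FMT_* used in C++)");
    return;
}
IntBuffer intBuf = ByteBuffer.wrap(frame).order(ByteOrder.BIG_ENDIAN).asIntBuffer();
int[] pixels = new int[intBuf.remaining()];
intBuf.get(pixels);
bmp = Bitmap.createBitmap(pixels, width, height, Bitmap.Config.ARGB_8888);
```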

@dylanatkeylogic

@alsaleem00 I see!

This may be a very ignorant question, but if I'm using the library via jitpack/gradle implementation, where would I find the C++ file in question to edit?

I can't seem to find it within the file structure.

@alsaleem00

alsaleem00 commented Mar 24, 2022

It is here. In the C++ code, change `AV_PIX_FMT_RGB24` to `AV_PIX_FMT_ARGB` (3 places).
You cannot make changes to the C++ if you are using the library as a build.gradle dependency. You need to get the sources (clone), make the changes, and then build locally.

@dylanatkeylogic

Interesting - still having the same issue with the `ArrayIndexOutOfBoundsException`.

My `pixels[]` has a cap of 1,555,200, while the area of a 1080p video is 2,073,600.

So my buffer is exactly 0.75 of that area - could I still have a pixel formatting issue after changing

`AV_PIX_FMT_RGB24` to `AV_PIX_FMT_ARGB`?

@alsaleem00

1,555,200 = 4 × 1080 × 360. Try debugging it:

```cpp
void callback(JNIEnv *env, uint8_t *buf, int nChannel, int width, int height) {
    int len = nChannel * width * height;
    __android_log_print(ANDROID_LOG_ERROR, TAG, "Ch %d, w %d, h %d, all %d", nChannel, width, height, len);
    jbyteArray gByteArray = env->NewByteArray(len);
    env->SetByteArrayRegion(gByteArray, 0, len, (jbyte *) buf);
    env->CallVoidMethod(gCallback, gCallbackMethodId, gByteArray, nChannel, width, height);
    env->DeleteLocalRef(gByteArray);
}
```

@dylanatkeylogic

dylanatkeylogic commented Mar 29, 2022

```
Ch 3, w 1920, h 1080, all 6220800
```

On the Java side, `intBuf.remaining()` is 1,555,200.

Then an `ArrayIndexOutOfBoundsException` at

```java
Bitmap bmp = Bitmap.createBitmap(pixels, width, height, Bitmap.Config.ARGB_8888);
```

Thank you so much for helping, by the way - I'm learning a lot here :)
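
For anyone reading along, these numbers line up once you account for the 3-vs-4 bytes-per-pixel mismatch (plain arithmetic, shown here for clarity):

```java
// The C++ side was still sending 3 bytes per pixel ("Ch 3"), but asIntBuffer()
// packs 4 bytes into each int, so the int count is only 3/4 of the pixel count.
int width = 1920, height = 1080;
int bytesFromNative = 3 * width * height;   // 6,220,800 -- matches "all 6220800"
int intsAvailable   = bytesFromNative / 4;  // 1,555,200 -- matches intBuf.remaining()
int intsNeeded      = width * height;       // 2,073,600 ints for an ARGB_8888 bitmap
// 1,555,200 / 2,073,600 = 0.75, hence the "exactly .75" observed earlier.
```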

@dylanatkeylogic

Update: in the actual callback call in native-lib.cpp, I upped the channel count to 4:

```cpp
callback(env, picture_buf2, 4, ccontext->width, ccontext->height);
```

Now I do have an image, but I'm getting maybe 2 fps and it crashes after about 10 seconds with no error.

@alsaleem00

The crash is because that many frames stack up before Java handles them, and memory gets full.
I had a similar situation until I controlled the rate at which the library sends frames. The achievable frame rate usually depends on the phone hardware. Maybe @rrahmati has a better approach already!

```cpp
static double now_ms() {
    struct timespec res = {};
    clock_gettime(CLOCK_REALTIME, &res);
    return 1000.0 * res.tv_sec + (double) res.tv_nsec / 1e6;
}
```
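
The snippet above paces callbacks on the native side. A complementary guard on the Java side - purely a sketch with hypothetical names (`onFrame`, `busy`), not the library's API - is to drop incoming frames while the previous bitmap is still being posted to the ImageView, so callbacks can never pile up faster than the UI thread drains them:

```java
// Sketch: skip frames while the previous one is still in flight.
private final java.util.concurrent.atomic.AtomicBoolean busy =
        new java.util.concurrent.atomic.AtomicBoolean(false);

void onFrame(byte[] frame, int nChannel, int width, int height) {
    if (!busy.compareAndSet(false, true)) {
        return; // previous frame not yet drawn -- drop this one
    }
    IntBuffer intBuf = ByteBuffer.wrap(frame).order(ByteOrder.BIG_ENDIAN).asIntBuffer();
    int[] pixels = new int[intBuf.remaining()];
    intBuf.get(pixels);
    Bitmap bmp = Bitmap.createBitmap(pixels, width, height, Bitmap.Config.ARGB_8888);
    ivPreview.post(() -> {
        ivPreview.setImageBitmap(bmp);
        busy.set(false);
    });
}
```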

@dylanatkeylogic

Still no luck. Very weird!

Any chance you could attach your native-lib.cpp?

@alsaleem00

```cpp
extern "C"
jint
Java_com_tge_rakinrtsp_RtspClient_play(JNIEnv *env, jobject, jstring endpoint, jint utime_ms) {
    SwsContext *img_convert_ctx;
    AVFormatContext *context = avformat_alloc_context();
    int video_stream_index = -1;

    updateTime = (utime_ms < 0) ? UPDATE_TIME_MS : utime_ms;

    //?av_register_all();
    avformat_network_init();

    AVDictionary *option = nullptr;
    av_dict_set(&option, "rtsp_transport", "tcp", 0);

    // Open RTSP
    const char *rtspUrl = env->GetStringUTFChars(endpoint, JNI_FALSE);
    int err = avformat_open_input(&context, rtspUrl, nullptr, &option);
    if (err != 0) {
        __android_log_print(ANDROID_LOG_ERROR, TAG, "Cannot open input %s, error code: %d", rtspUrl, err);
        avformat_free_context(context);
        av_dict_free(&option);
        return JNI_ERR;
    }
    env->ReleaseStringUTFChars(endpoint, rtspUrl);

    av_dict_free(&option);

    if (avformat_find_stream_info(context, nullptr) < 0) {
        __android_log_print(ANDROID_LOG_ERROR, TAG, "Cannot find stream info");
        avformat_free_context(context);
        return JNI_ERR;
    }

    // Search video stream
    for (int i = 0; i < context->nb_streams; i++) {
        if (context->streams[i]->codec->codec_type == AVMEDIA_TYPE_VIDEO)
            video_stream_index = i;
    }

    if (video_stream_index == -1) {
        __android_log_print(ANDROID_LOG_ERROR, TAG, "Video stream not found");
        avformat_free_context(context);
        return JNI_ERR;
    }

    AVPacket packet;
    av_init_packet(&packet);

    // Open output file
    AVFormatContext *oc = avformat_alloc_context();
    AVStream *stream = nullptr;

    // Start reading packets from stream and write them to file
    av_read_play(context);

    AVCodec *codec;
    codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    if (!codec) {
        __android_log_print(ANDROID_LOG_ERROR, TAG, "Cannot find decoder H264");
        avformat_free_context(context);
        return JNI_ERR;
    }

    AVCodecContext *ccontext = avcodec_alloc_context3(nullptr);
    avcodec_get_context_defaults3(ccontext, codec);
    avcodec_copy_context(ccontext, context->streams[video_stream_index]->codec);

    if (avcodec_open2(ccontext, codec, nullptr) < 0) {
        __android_log_print(ANDROID_LOG_ERROR, TAG, "Cannot open codec");
        avformat_free_context(context);
        avcodec_free_context(&ccontext);
        return JNI_ERR;
    }

    img_convert_ctx = sws_getContext(ccontext->width, ccontext->height, ccontext->pix_fmt, ccontext->width, ccontext->height,
                                     AV_PIX_FMT_ARGB, SWS_BICUBIC, nullptr, nullptr, nullptr);

    size_t size = avpicture_get_size(AV_PIX_FMT_YUV420P, ccontext->width, ccontext->height);
    uint8_t *picture_buf = (uint8_t *) (av_malloc(size));
    AVFrame *pic = av_frame_alloc();
    AVFrame *picrgb = av_frame_alloc();
    size_t size2 = avpicture_get_size(AV_PIX_FMT_ARGB, ccontext->width, ccontext->height);
    uint8_t *picture_buf2 = (uint8_t *) (av_malloc(size2));
    avpicture_fill((AVPicture *) pic, picture_buf, AV_PIX_FMT_YUV420P, ccontext->width, ccontext->height);
    avpicture_fill((AVPicture *) picrgb, picture_buf2, AV_PIX_FMT_ARGB, ccontext->width, ccontext->height);

    __android_log_print(ANDROID_LOG_INFO, TAG, "Start RTSP streaming %d msec", updateTime);
    isStop = false;
    double tstart = now_ms();
    double curtime;
    while (!isStop && av_read_frame(context, &packet) >= 0) {
        if (packet.stream_index == video_stream_index) { // Packet is video
            if (stream == nullptr) {
                stream = avformat_new_stream(oc, context->streams[video_stream_index]->codec->codec);
                avcodec_copy_context(stream->codec, context->streams[video_stream_index]->codec);
                stream->sample_aspect_ratio = context->streams[video_stream_index]->codec->sample_aspect_ratio;
            }

            int check = 0;
            packet.stream_index = stream->id;
            avcodec_decode_video2(ccontext, pic, &check, &packet);
            sws_scale(img_convert_ctx, (const uint8_t *const *) pic->data, pic->linesize, 0, ccontext->height, picrgb->data, picrgb->linesize);

            curtime = now_ms();
            if ((curtime - tstart) > updateTime) {
                tstart = curtime;
                callback(env, picture_buf2, 4, ccontext->width, ccontext->height);
            }
        }
        av_free_packet(&packet);
        av_init_packet(&packet);
    }

    av_free(pic);
    av_free(picrgb);
    av_free(picture_buf);
    av_free(picture_buf2);

    av_read_pause(context);
    avio_close(oc->pb);
    avformat_free_context(oc);
    avformat_close_input(&context);

    // signal terminated
    callback(env, nullptr, 0, 0, 0);

    return isStop ? JNI_OK : JNI_ERR;
}
```

@alsaleem00

```cpp
#define UPDATE_TIME_MS (150)
bool isStop = false;
int updateTime;
```
