Media timecode framerate not matching media framerate

Hi all,

I’ve been having an issue with RV defaulting the source media timecode framerate to 24fps despite the source media being at a different framerate.
The current example I’m working with is a 23.976fps QuickTime .mov. Loading it into RV and displaying the Image Info overlay, I can see the FPS is recognized correctly as 23.976, but the Timecode/Frame Rate shows as 24. This is also reflected in the source frame vs. source timecode in the timeline.

[Screenshot: Image Info overlay (media_info_edit)]

This obviously impacts the source timecode-to-frame conversion, and makes things messy when trying to set cut in/out frame values based on EDL timecodes. Right now I’m hard-coding the timecode-to-frame calculation to assume 24fps, even though the project and the source media are at a different framerate.
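For reference, the conversion I’m doing is roughly the sketch below (non-drop-frame timecode only; the helper name and sample values are just for illustration):

```python
def timecode_to_frame(tc: str, fps: float = 24.0) -> int:
    """Convert a non-drop-frame HH:MM:SS:FF timecode to an absolute frame number."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    # NDF timecode counts whole frames, so the nominal (rounded) rate is used:
    # 24 for 23.976, 30 for 29.97, etc. For now fps is left at 24 to match
    # what RV reports, even though the media is really 23.976.
    return (hh * 3600 + mm * 60 + ss) * round(fps) + ff

# e.g. a cut-in pulled from an EDL record:
# timecode_to_frame("01:00:08:15")  # -> 86607
```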

I’ve set RV’s default framerate to 23.976, but this doesn’t appear to have any impact.

Perhaps it’s an issue with my source media, but I doubt it. I’ve checked the video and timecode stream data using ffprobe, which shows both at 23.976. The ffprobe stream output is attached below, in case I’m interpreting it incorrectly.

[Attachment: ffprobe video stream output]

[Attachment: ffprobe timecode stream output (tc_stream)]
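In case I’m misreading those fields, this is roughly how I’m pulling the values out with ffprobe (a quick sketch rather than my exact script; the helper name and file path are placeholders, but `r_frame_rate` and the `timecode` tag are the actual fields in ffprobe’s JSON output):

```python
import json
import subprocess
from fractions import Fraction

def probe_rate_and_timecode(path):
    """Print each stream's frame rate and timecode tag from ffprobe's JSON output."""
    out = subprocess.run(
        ["ffprobe", "-v", "error", "-print_format", "json", "-show_streams", path],
        capture_output=True, text=True, check=True,
    ).stdout
    for stream in json.loads(out)["streams"]:
        rate = stream.get("r_frame_rate")            # e.g. "24000/1001" for 23.976
        fps = float(Fraction(rate)) if rate and rate != "0/0" else None
        tc = stream.get("tags", {}).get("timecode")  # set on the tmcd stream (and often the video stream)
        print(stream["codec_type"], fps, tc)

# probe_rate_and_timecode("source_media.mov")
```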

Any tips on how to force RV to interpret the source timecode at the correct framerate would be much appreciated!