Export CSV - Rest Call or Python

Hello,
I have looked around and cannot find a post covering this exact scenario.
But I was curious whether there is a programmatic way to call the provided “Export to CSV” command that tables include.

The context is that production wants to export only what is visible and selected in the table, exactly as it looks on the page.
Whether the route is a simple exposed REST call, a Python integration, or an indirect web scraping option, I am curious what the Autodesk implementation is doing.
Even if I could just mimic what “Export CSV” does and capture it in memory before it is written out, that would be incredible.

However, what I am aiming to avoid is a heavily recursive set of shotgun_api3 calls, as the ordering and visibility of the current web page do not seem to be available through the API.
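
To be concrete, here is a rough sketch of the kind of in-memory export I mean, built from a plain shotgun_api3 query. The site URL, credentials, entity type, filters, and field list are all placeholders I made up; the catch is that the field list and filters have to be hard-coded, which is exactly the page-layout information I can’t get at.

```python
# Rough sketch only: build a CSV in memory from a plain shotgun_api3 query.
# The site URL, script credentials, entity type, filters, and field list are
# all placeholders -- the field list is precisely what the page layout would
# normally tell us.
import csv
import io

import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://yourstudio.shotgrid.autodesk.com",  # placeholder site URL
    script_name="csv_export_script",             # placeholder script name
    api_key="REPLACE_ME",                        # placeholder API key
)

# Hard-coded because the API does not expose which columns/filters the
# user's page is currently showing.
fields = ["code", "sg_status_list", "description"]
filters = [["project", "is", {"type": "Project", "id": 123}]]

rows = sg.find("Shot", filters, fields)

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["type", "id"] + fields)
writer.writeheader()
for row in rows:
    writer.writerow(row)

csv_text = buffer.getvalue()  # the CSV content, still in memory
```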

Any insight is appreciated greatly.

Thank you

EDIT: This post I made was misleading. See the responses below!

Hi @LandonJPG,

TL;DR, unfortunately I don’t think so. But what do I know?

I think the key thing that’s missing is the ability to use the APIs to access the state of a page a user is on.

I’ve wanted a similar ability for various reasons, but unfortunately, AFAIK there has never been a way to access page layouts or active filters from the Python API, and I would assume the REST API has the same limitation. You CAN get selection information from Action Menu Items, or from a Toolkit app that runs within tk-shotgun, but the info about what the page looks like is not accessible, I believe.

I would love access to this information so that I could “copy” pages from one site to another, but AFAIK it’s not currently possible.

I’m not sure exactly what problem and application you’re trying to tackle, but my first thought is that, from the user’s perspective, the only way to get something working would be a tool where they export the CSV and feed it into an application which reads your schema, looks at the data in the spreadsheet, finds the fields that the headers correspond to, re-queries all the values, and finally feeds that to whatever destination it’s going to.
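
Very roughly, something like the sketch below is what I have in mind. Treat the site credentials, the entity type, and the assumption that the export has an “Id” column as things I’m making up for illustration.

```python
# Rough sketch of the workaround described above: read the exported CSV,
# map its display-name headers back to field codes via the schema, then
# re-query the values through the API. Credentials, entity type, and the
# "Id" column are assumptions.
import csv

import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://yourstudio.shotgrid.autodesk.com",  # placeholder
    script_name="csv_export_script",
    api_key="REPLACE_ME",
)

entity_type = "Shot"  # assumed; the CSV itself doesn't say what entity it is

with open("exported_page.csv", newline="") as f:
    reader = csv.DictReader(f)
    headers = reader.fieldnames or []
    exported_rows = list(reader)

# Map display names in the CSV header (e.g. "Shot Code") back to field codes
# (e.g. "code") using the schema.
schema = sg.schema_field_read(entity_type)
display_to_code = {
    props["name"]["value"]: field_code for field_code, props in schema.items()
}
field_codes = [display_to_code[h] for h in headers if h in display_to_code]

# Re-query the real values for the exported rows (assuming an "Id" column).
ids = [int(r["Id"]) for r in exported_rows if r.get("Id")]
fresh_rows = sg.find(entity_type, [["id", "in", ids]], field_codes)
```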

Messy, if that’s the only way to do it. That’s all I know and I could be wrong, so take it with a grain of salt.

Best of luck!
-Dashiel

Also, I’m not a web developer at all, but your web scraping idea is intriguing, though it sounds complicated.

You can get the state of the page with an AMI.
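
Roughly speaking, the form-encoded POST data an AMI sends includes keys like entity_type, selected_ids, cols, column_display_names, sort_column, and sort_direction (double-check against the payload your own AMI actually receives). A handler can pull the page state out along these lines:

```python
# Minimal sketch of pulling page/selection state out of an AMI POST payload.
# The key names reflect my understanding of the AMI docs; verify them against
# the payload your AMI actually receives.

def parse_ami_payload(form):
    """Extract page and selection state from form-encoded AMI POST data."""
    return {
        "entity_type": form.get("entity_type"),
        # Selected row ids arrive as a comma-separated string.
        "selected_ids": [int(i) for i in form.get("selected_ids", "").split(",") if i],
        # Visible columns (field codes) and their display names, also comma-separated.
        "cols": [c for c in form.get("cols", "").split(",") if c],
        "column_display_names": [c for c in form.get("column_display_names", "").split(",") if c],
        "sort_column": form.get("sort_column"),
        "sort_direction": form.get("sort_direction"),
    }


# Fabricated payload, just to show the shape of the result:
state = parse_ami_payload({
    "entity_type": "Shot",
    "selected_ids": "1001,1002",
    "cols": "code,sg_status_list",
    "column_display_names": "Shot Code,Status",
    "sort_column": "code",
    "sort_direction": "asc",
})
print(state["selected_ids"])  # [1001, 1002]
```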

Ah-hah! So you can!

Thanks for the clarification, Ricardo.

Hey all,
Thanks for the responses.
@Ricardo_Musch, is the payload from an AMI enough information to reconstruct the page from, or is there a specific key in the payload I am overlooking?
Mind pointing me in a direction with example documentation or code?

I’m wanting to avoid the CSV export route, as it fundamentally adds more steps to what I am hoping can be a simple command for a user. And yeah, web scraping is just not pleasant.

Have a look at this library
