The only issue now is that my CSV table needs to be flipped on a vertical line. The -180 longitude is currently in the leftmost column and the 180 longitude is in the rightmost. It needs to be the other way around. I tried to change this with the API, but it only allows positive increments to the latitude/longitude values.
I'll either need to resolve this flip here or change the next phase of the bilinear interpolation code to suit the flipped CSV.
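For reference, if the mirror did turn out to be necessary, the idea would just be to reverse each row of the table. A minimal Python sketch (not App Inventor blocks, and the file names are only placeholders):

```python
import csv

# Hypothetical file names; substitute your own.
with open("grid.csv", newline="") as f:
    rows = list(csv.reader(f))

# Flip the table on a vertical axis: reverse the column order of every row,
# so the -180 longitude column moves from the left edge to the right edge.
flipped = [list(reversed(row)) for row in rows]

with open("grid_flipped.csv", "w", newline="") as f:
    csv.writer(f).writerows(flipped)
```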
When I integrated the JSON scrape into my Bilinear Interpolation code I'd shared earlier, I realised that all I needed to do was change the way the interpolation code reads the data in. Instead of reading it in descending order (180 degrees through to -180 degrees), I simply told it to read it as if it was ascending (-180 degrees through to 180 degrees). Ultimately, no need to compute a mirror.
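In other words, the fix is purely in the index arithmetic. A rough Python illustration of the two read orders (the 1-degree grid spacing and the function names here are assumptions for the example, not my actual blocks):

```python
# Suppose each row holds values at 1-degree steps of longitude.
# If the row is stored descending (180 ... -180), the column for a
# given longitude would be:
def col_descending(lon, step=1.0):
    return int(round((180 - lon) / step))

# Reading the same row as ascending (-180 ... 180) only changes the mapping:
def col_ascending(lon, step=1.0):
    return int(round((lon + 180) / step))

# Either way the data itself is untouched; no mirrored copy is needed.
print(col_descending(-180), col_ascending(-180))  # 360, 0
```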
My biggest gripe right now is the way I translate my long, single list (from the API scrape) into a multi-line string for saving as a CSV and then reloading to convert it back to a list of lists using the "list from csv table" block. It's very clunky.
Ideally, I'd "simply" write a procedure that parses a single list into a list of lists of X*Y dimensions. If anyone has any ideas on this, you're most welcome to share.
Thank you. It works as expected. I see the blocks add each item from the long list to a temporary list until it reaches the width parameter, then append that temporary list as a row to a new output list, repeating until every item in the original list has been parsed.
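For anyone following along, here is the same chunking logic expressed as a short Python sketch (the actual solution is App Inventor blocks; this just shows the idea):

```python
def chunk(flat_list, width):
    """Split a flat list into a list of rows, each `width` items long."""
    output = []
    row = []
    for item in flat_list:
        row.append(item)
        if len(row) == width:   # reached the width parameter
            output.append(row)  # add the finished row to the output list
            row = []            # start a fresh temporary list
    if row:                     # leftover items, if the length isn't an exact multiple
        output.append(row)
    return output

# Example: a 6-item list parsed into 2 rows of 3.
print(chunk([1, 2, 3, 4, 5, 6], 3))  # [[1, 2, 3], [4, 5, 6]]
```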
Now I'm grappling with a way to save the list of lists externally using my original plan, which was a CSV (essentially a text file). But I feel this approach might not be appropriate anymore, because saving my list of lists as a text string throws away the list structure, i.e. I can't simply write a string to a file in the "format" of a list and expect the system to interpret that string as a list when I load it back in.
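To illustrate the concern (a Python analogy, not the blocks themselves): once the structure has been flattened to text, it comes back as plain text and has to be explicitly parsed to become a list again:

```python
data = [[1.5, 2.0], [3.25, 4.75]]

# Writing the list's text form to a file...
with open("data.txt", "w") as f:
    f.write(str(data))

# ...gives back a plain string on reload, not a list of lists.
with open("data.txt") as f:
    loaded = f.read()

print(type(loaded))          # <class 'str'>
print(loaded == str(data))   # True, but it still needs parsing to be a list again
```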
I should probably be using something like a TinyDB to save the list of lists. I think @TIMAI2 's post here might give me what I need. I'll review that, try to understand it, and report back on how it goes.
Reporting back: I think I'm at the end of this thread with a great outcome.
Simply storing my API-scraped JSON data as a list of lists in a TinyDB has given the best outcome. There's no need to convert the JSON to CSV and back to a list; that was only creating work that didn't need to be done.
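Conceptually, TinyDB's StoreValue/GetValue blocks handle the serialisation and parsing for me, so the list of lists goes in and comes back out as a list. A rough Python analogy of what that saves me from doing by hand (the tag name and file name are just placeholders):

```python
import json

grid = [[1.5, 2.0], [3.25, 4.75]]

# Roughly what TinyDB.StoreValue does behind the scenes:
# serialise the value and persist it under a tag.
with open("tinydb_store.json", "w") as f:
    json.dump({"gridData": grid}, f)

# Roughly what TinyDB.GetValue does: load and parse,
# returning a real list of lists rather than a string.
with open("tinydb_store.json") as f:
    restored = json.load(f)["gridData"]

print(restored == grid)  # True
```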
Always good to stop, take stock and ask yourself (numerous times), "Why am I doing this?"
Thank you @ABG for sticking with me through this thread and providing your guidance.