Hello Peter, that is about how much text you can put into a Text Block (or Label or TextBox), nothing specific to importing a CSV, unless you are simply importing the whole file into a Text Block (which you shouldn't be).
...and that is concerned with a method that attaches to a URL, so the limit is the max length of a URL - and as Taifun has pointed out right there, the method is now obsolete anyway.
It's possible that there is something wrong with the CSV file itself. If you post a representative large file and just the code snippet you use to process the file, we can help you more.
Many thanks Anke - there must be something very wrong with what I'm doing, and I'm guessing it has to do with Chris' observation that I shouldn't be loading all the text into a single block. I've been looking for alternatives and must be missing something obvious. I've finished listening to Beethoven's 5th and my latest 100,000-record test database still hasn't loaded. At Chris' suggestion I'm sending him an image of my cut-down method and the test file that I know is not corrupted, i.e. it works as a small file, but extending it to a large file either takes far too long or does not work at all.
The problem is not reading the text, but displaying it in a Label.
It takes about 2 seconds to read (and about 5 seconds for a 10 MB csv file). You can save the text in a variable "data":
Thanks for the explanation Chris and Tim. Anke has demonstrated that large files can be handled in about 12 seconds, so there must be something wrong with my method. I've attached a cut-down version of my database read/write operations and a test file that I believe is not corrupted, as it works when it has a small number of records, i.e. 15,000, but not when it is extended to 100,000. I am reading the records as a single text file using the Read File / Got Text component, so that may be the problem if I understand you correctly. I'd appreciate any advice you can give. MyDB8.csv (2.6 MB)
It might go against your storage approach, but I would prefer to store the data from the csv into a list, then store the list as a whole in the tinydb. Call back the list from the tinydb and work with the list, as opposed to 100,000 tags in the tinydb. I haven't dealt with saving back to the csv because I loaded it in the assets, but all you do is save the list to the file (using a "csv table from list" block).
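In conventional code terms, the pattern above looks something like this Python sketch (not App Inventor blocks; the sample CSV text, field layout, and the dictionary standing in for TinyDB are all invented for illustration):

```python
import csv
import io

# Hypothetical in-memory stand-in for TinyDB: a simple tag -> value store.
store = {}

# Made-up sample data; in the real app this would come from the csv file.
CSV_TEXT = "id,name,ra,dec\n1,M31,00h42m,+41d16m\n2,M42,05h35m,-05d23m\n"

# Parse the CSV text into a list of records (one inner list per row).
rows = list(csv.reader(io.StringIO(CSV_TEXT)))
header, records = rows[0], rows[1:]

# Store the whole list under a single tag: one write, however many records.
store["database"] = records

# Reading it back returns the full list, ready to search and edit in memory.
data = store["database"]
print(len(data))     # number of records -> 2
print(data[0][1])    # name field of the first record -> M31
```

The key point is that the store only ever sees one tag ("database") holding the entire list, rather than one tag per record.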
I don't display the text in a label but do save it in a list and then transfer it to TinyDB - perhaps that is where the excessive delay is and where there may be a better method? Once TinyDB is loaded up with the database it performs quickly and well but it has always taken a long time (half an hour or more) to read the external database into TinyDB.
It could be that the excessive TinyDB load time is just too long for the phone to handle without going to sleep or shutting down?
Chris, Thanks for that suggestion - would it still allow the App user to search on tags or key words in the list? That's a critical function in my application, searching and retrieving a record has to be as simple as possible as the user will be working outdoors in a dark environment and will be focused on setting up and using the telescope rather than manipulating lists.
Would it be reasonable after the csv has been rapidly loaded to convert the list to a tinyDB format with tags? My experience with tiny db so far is that once loaded it is very fast at searching and retrieving records.
I use SQL occasionally - not all that user friendly in the environment I'm thinking of, and each record only has four items, so SQL might be overkill - once loaded, tiny db has worked well over the last year or two.
"Chris, Thanks for that suggestion" - did you mean me ?
These blocks search through the records in all fields for the search term, and return the first find. This could be developed to return a list of records that contain the search term for a more refined selection. It all depends on how unique the records and fields are. Searching on just a unique field would return a single result.
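The search logic described above, sketched in Python rather than blocks (the sample records and field layout are invented for illustration), amounts to a linear scan across every field of every record:

```python
# Hypothetical sample records: [name, type, magnitude, constellation].
records = [
    ["M31", "galaxy", "3.4", "Andromeda"],
    ["M42", "nebula", "4.0", "Orion"],
    ["M45", "cluster", "1.6", "Taurus"],
]

def find_first(records, term):
    """Return the first record containing the search term in any field."""
    for record in records:
        if any(term.lower() in field.lower() for field in record):
            return record
    return None

def find_all(records, term):
    """Refinement: return every record containing the search term."""
    return [r for r in records
            if any(term.lower() in f.lower() for f in r)]

print(find_first(records, "Orion"))   # -> ['M42', 'nebula', '4.0', 'Orion']
print(len(find_all(records, "M4")))   # -> 2 (matches M42 and M45)
```

As noted, searching on a unique field (here, the name) yields a single match, while a partial term like "M4" returns several.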
Thanks to everyone, especially Tim for his List approach, which works very well, and Anke for demonstrating that the problem was in the processing and not the data file read operation. Also apologies for getting Tim's name wrong - it was 2 am in Australia.
I've tried Tim's list approach - it's way faster than Tiny DB and fits with my App software well. Both Tiny DB and Tim's List approach retrieve records from the database quickly: Tiny DB has almost instant record retrieval, and the list approach takes about 15 seconds to retrieve records that are near the bottom of a 100,000-record list. Still perfectly acceptable for my application. Tim's approach sets up the 100,000-record list in seconds; Tiny DB takes hours to assign tags to the database.
Unless my App is flawed I'd suggest the practical limit for tiny DB is about 2,000 records. It takes about 90 seconds to set up 2000 tags. The test blocks I used are copied below. I used the 100,000 record database packaged with the APK and controlled the number of records assigned tags using the text box called File Size.
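One plausible explanation for the numbers above (an assumption about how per-tag storage behaves, not something verified against TinyDB's internals) is that each tag assignment is a separate store operation, so N records cost N writes, while the list approach costs a single write. A Python sketch of the two patterns, counting writes with a made-up stand-in store:

```python
# Illustration of write counts only; this is not a claim about TinyDB internals.

class CountingStore:
    """Hypothetical key-value store that counts every write it receives."""
    def __init__(self):
        self.data = {}
        self.writes = 0

    def store_value(self, tag, value):
        self.data[tag] = value
        self.writes += 1

# Made-up records, sized to match the 2,000-tag test described above.
records = [[i, f"object-{i}"] for i in range(2000)]

# Pattern A: one tag per record -> 2,000 separate store operations.
per_tag = CountingStore()
for rec in records:
    per_tag.store_value(str(rec[0]), rec)

# Pattern B: the whole list under a single tag -> one store operation.
as_list = CountingStore()
as_list.store_value("database", records)

print(per_tag.writes, as_list.writes)   # -> 2000 1
```

If each write carries fixed overhead, the per-tag pattern's cost grows linearly with record count, which would be consistent with 2,000 tags taking about 90 seconds while the single-list write stays fast.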
Works beautifully Tim, and fits with my App well - see a summary at the end of the thread. And my apologies for getting your name wrong - it was 2 am in Australia.
Thanks for the suggestion ABG. Many special purpose astronomical databases change regularly as astronomical objects can have a number of variables and no astronomy database is complete. The one I've set up is editable by the user and accessible via Excel. My experience of SQL (which I used frequently at work) is that it's a powerful way of developing complex enquiries but not particularly fast and not particularly user friendly.
Q9: I have an application with many thousands of rows (now over 100,000) in TinyDB, and TinyDB is really slow. I need to upgrade. Will your SQLite extension be faster than TinyDB?
... I now received the following feedback: SQLite is many, many times faster. For example, filling up 100,000 rows: TinyDB takes over one hour, SQLite 6 minutes. And fully fledged work with SQL commands - priceless. Thanks for your Extension.