Large data set, insert into sqlite


#1

Hi, I am developing an app which reads ticket numbers and then checks whether they are valid. The problem is that I need to load the valid ticket codes from a text file, so I read the file and insert the codes into an SQLite database. With larger data sets (10,000 tickets and more), it takes ages. Is there some way to make it faster?

The method that opens the file, reads it, and inserts the data into the database looks like this:

    openFile() {
        this.fileChooser.open().then(uri => {
            this.filePath.resolveNativePath(uri).then((path) => {
                console.log(path);
                let dirPath = path.substring(0, path.lastIndexOf('/'));
                let fileName = path.substring(path.lastIndexOf('/') + 1);
                console.log(dirPath, fileName);
                this.file.readAsText(dirPath, fileName).then((data) => {
                    // Wait for the delete to finish before inserting;
                    // otherwise the delete can race with the inserts.
                    this.db.deleteAllTickets().then(() => {
                        let tickets = data.split('\n');
                        tickets.forEach((ticket, i) => {
                            this.db.insertTicketToTickets(ticket.trim()).then(data => {
                                console.log("hura", data);
                            }, err => console.log(err));
                            console.log(i + 1);
                        });
                    });
                }).catch(err => console.log(err));
            });
        });
    }

And the insert function looks like this:

    insertTicketToTickets(ticketId) {
        return this.storage.executeSql('INSERT INTO tickets(ticketNumber) VALUES (?)', [ticketId]).then((data) => {
            return true;
        }, (error) => {
            console.log(error);
            return false; // resolve with false instead of undefined on failure
        });
    }

Any ideas?


#2

I think the way you are doing it is fine. A data import takes a lot of time, and the only way to make it faster would be a device with a multi-core processor.

Does the user have to import those tickets every time he uses the app?


#3

I’m using uk.co.workingedge.cordova.plugin.sqliteporter to import large amounts of data in seconds.
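A rough sketch of that approach: sqlitePorter’s `importSqlToDb` takes a SQL string, so the ticket file has to be turned into one bulk `INSERT` first. The `ticketsToSql` helper below is my own illustration, not part of the plugin; the usage comment assumes an already-opened database handle.

```typescript
// Turn the ticket file into a single multi-row INSERT that sqlitePorter
// can import in one go. Single quotes are escaped by doubling them.
function ticketsToSql(fileData: string): string {
    const values = fileData
        .split('\n')
        .map(t => t.trim())
        .filter(t => t.length > 0)
        .map(t => `('${t.replace(/'/g, "''")}')`)
        .join(',');
    return `INSERT INTO tickets(ticketNumber) VALUES ${values};`;
}

// Usage sketch (assumes the plugin is installed and `db` is an open handle):
// cordova.plugins.sqlitePorter.importSqlToDb(db, ticketsToSql(data), {
//     successFn: () => console.log('import done'),
//     errorFn: (err) => console.log(err)
// });
```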


#4

I solved it by using sqlBatch for the database inserts. Now it takes about 3 s to load a file with 25,000 records.
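For anyone else hitting this, a minimal sketch of the sqlBatch approach, assuming the Ionic Native SQLite wrapper (`this.storage` being an opened `SQLiteObject`). `sqlBatch` runs all statements in a single transaction, which is what makes it fast; the `buildTicketBatch` helper name is illustrative, not from the original post.

```typescript
// Build one [sql, params] pair per non-empty line of the ticket file.
// sqlBatch accepts an array of plain SQL strings and [sql, params] pairs.
function buildTicketBatch(fileData: string): [string, string[]][] {
    return fileData
        .split('\n')
        .map(line => line.trim())
        .filter(line => line.length > 0)
        .map(ticket =>
            ['INSERT INTO tickets(ticketNumber) VALUES (?)', [ticket]] as [string, string[]]
        );
}

// Usage sketch inside the db service:
// this.storage.sqlBatch([
//     'DELETE FROM tickets',           // replaces deleteAllTickets()
//     ...buildTicketBatch(data)        // all inserts in one transaction
// ]).then(() => console.log('import done'))
//  .catch(err => console.log(err));
```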


#5

Can you please share the code here… I am struggling to insert 12k records.