This question has been answered by okyanet. See the first response.
<?php
/**
 * @author OrangeLine - Lazylegs
 * @copyright 2011
 * @package MultiInsert
 */
$path = MODX_CORE_PATH . 'components/geo/';
$result = $modx->addPackage('geo', $path . 'model/', 'geo_');
if (!$result) {
    return 'failed';
}

$time = explode(' ', microtime());
$time = $time[1] + $time[0];
$begintime = $time; // start timing

$handler = fopen('/absolute/path/to/file/lazy_geo.csv', 'r');
$i = 0;
while (($data = fgetcsv($handler, 512, ';', '"'))) {
    $q = $modx->newObject('Places', array(
        'suburb'    => $data[0],
        'latitude'  => $data[1],
        'longitude' => $data[2],
    ));
    $q->save();
    $i++;
}
fclose($handler);

$time = explode(' ', microtime());
$time = $time[1] + $time[0];
$endtime = $time; // stop timing
$totaltime = $endtime - $begintime;

return $totaltime . ' seconds and ' . $i . ' rows were inserted';
<?php
/**
 * @author OrangeLine - Lazylegs
 * @copyright 2011
 * @package MultiInsert
 */
$path = MODX_CORE_PATH . 'components/geo/';
$result = $modx->addPackage('geo', $path . 'model/', 'geo_');
if (!$result) {
    return 'failed';
}

$table = $modx->getTableName('Places');
$sql = "INSERT INTO {$table} (suburb, latitude, longitude) VALUES (?, ?, ?)";
$q = $modx->prepare($sql);

$time = explode(' ', microtime());
$time = $time[1] + $time[0];
$begintime = $time; // start timing

$handler = fopen('/absolute/path/to/file/lazy_geo.csv', 'r');
$i = 0;
while (($data = fgetcsv($handler, 512, ';', '"'))) {
    $q->bindValue(1, $data[0]);
    $q->bindValue(2, $data[1]);
    $q->bindValue(3, $data[2]);
    $q->execute();
    $i++;
}
fclose($handler);

$time = explode(' ', microtime());
$time = $time[1] + $time[0];
$endtime = $time; // stop timing
$totaltime = $endtime - $begintime;

return $totaltime . ' seconds and ' . $i . ' rows were inserted';
You mean like adding a whole table, pre-populated?
Otherwise, another forum member successfully ran ImportX for over 7000 records, but needed to raise the PHP timeout limit. That just runs save() for every record and seemed to work OK...
I would not recommend using xPDO objects directly to insert a lot of data. As stated above, you need to raise the PHP timeout to keep the script from dying.
Results
- Run #1: 6.1418168544769 seconds and 63214 rows were inserted
- Run #2: 5.9312469959259 seconds and 63214 rows were inserted
- Run #3: 5.4812779426575 seconds and 63214 rows were inserted
- Run #4: 5.7925748825073 seconds and 63214 rows were inserted
- Run #5: 5.6238942146301 seconds and 63214 rows were inserted
And the same run, but replacing the $q->bindValue() calls with a single parameter array passed to $q->execute(array($data[0], $data[1], $data[2])):
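To make the difference concrete, here is a minimal self-contained sketch of the execute-with-array form. It uses plain PDO with an in-memory SQLite database and made-up sample rows instead of the MODX connection and CSV file from the snippets above, but the loop body is the same idea: one execute() call with a parameter array replaces the three bindValue() calls.

```php
<?php
// Sketch: execute() with a parameter array instead of bindValue() calls.
// In-memory SQLite and hard-coded rows stand in for $modx->prepare() and fgetcsv().
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE places (suburb TEXT, latitude TEXT, longitude TEXT)');

$q = $pdo->prepare('INSERT INTO places (suburb, latitude, longitude) VALUES (?, ?, ?)');

// Sample data; in the real script each $data comes from fgetcsv().
$rows = array(
    array('Springfield', '-33.86', '151.21'),
    array('Shelbyville', '-37.81', '144.96'),
);

$i = 0;
foreach ($rows as $data) {
    // One call replaces the three bindValue() lines.
    $q->execute(array($data[0], $data[1], $data[2]));
    $i++;
}
```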
Results
- Run #1: 5.6559760570526 seconds and 63214 rows were inserted
- Run #2: 5.5258798599243 seconds and 63214 rows were inserted
- Run #3: 5.4761080741882 seconds and 63214 rows were inserted
- Run #4: 5.466236114502 seconds and 63214 rows were inserted
- Run #5: 5.6207709312439 seconds and 63214 rows were inserted
So, a quick summary: if you just need to insert data into the database in big lumps, avoid newObject() unless you are 100% sure you can insert all the data within the PHP time limit, or break the insert down into smaller batches.
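One way to break the insert into smaller batches is to wrap groups of rows in transactions and commit every N rows. The sketch below again uses plain PDO with in-memory SQLite rather than the MODX connection, and the batch size of 1000 is a hypothetical value you would tune for your own data set.

```php
<?php
// Sketch: commit every $batchSize rows inside a transaction, so a large import
// neither runs as one giant transaction nor autocommits on every single row.
// In-memory SQLite and generated rows stand in for the MODX database and the CSV.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE places (suburb TEXT, latitude TEXT, longitude TEXT)');

$q = $pdo->prepare('INSERT INTO places (suburb, latitude, longitude) VALUES (?, ?, ?)');

// Generated sample data; in the real script each $data comes from fgetcsv().
$rows = array();
for ($n = 0; $n < 2500; $n++) {
    $rows[] = array('Suburb ' . $n, '0.0', '0.0');
}

$batchSize = 1000; // hypothetical batch size; tune for your data set
$i = 0;

$pdo->beginTransaction();
foreach ($rows as $data) {
    $q->execute(array($data[0], $data[1], $data[2]));
    $i++;
    if ($i % $batchSize === 0) {
        $pdo->commit();           // flush this batch to the database
        $pdo->beginTransaction(); // start the next batch
    }
}
$pdo->commit(); // commit any remaining rows in the final partial batch
```

Batching like this keeps each unit of work small enough that a failure or timeout only loses the current batch, and committing in groups is typically much faster than autocommitting each row.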