Ok. Here's how I got the Evo WebLoginPD user_attributes_extended table converted into JSON and loaded into the Revo user_attributes table's extended field.
- Use phpMyAdmin to get a JSON dump of the user_attributes_extended table.
- Edit the .json file: delete the leading comments and the enclosing [ ] brackets.
- Load it and process it in a snippet. It's ugly, and I had to increase my max_execution_time for it, but it worked.
$file = 'assets/files/modx_user_attributes_extended.json';
if (is_file($file)) {
    $json = file_get_contents($file);
    // Split on '},' -- explode() eats the closing brace, so it gets restored below.
    $array = explode('},', $json);
    foreach ($array as $value) {
        // rtrim first so the final row (which keeps its own '}') doesn't end up as '}}'
        $row = json_decode(rtrim($value, " }\r\n\t") . '}');
        if (!$row || empty($row->internalKey)) continue; // skip malformed rows
        $user = $modx->getObject('modUser', $row->internalKey);
        if (!$user) continue;
        $profile = $user->getOne('Profile');
        if (!$profile) continue;
        $fields = $profile->get('extended');
        if (!is_array($fields)) $fields = array();
        $fields['value1'] = $row->value1;
        $fields['value2'] = $row->value2;
        // ... one line per extended field ...
        $profile->set('extended', $fields);
        $profile->save();
    }
}
return;
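As an aside: if you leave the enclosing [ ] in the export instead of deleting them, you can skip the explode/brace-restoring trick and decode the whole file in one call. A sketch of that variant, assuming the same MODX objects and that each exported row carries internalKey plus the extended attribute columns:

```php
// Sketch: decode the entire JSON export at once (requires the [ ] wrapper intact).
$file = 'assets/files/modx_user_attributes_extended.json';
if (is_file($file)) {
    // true => associative arrays instead of stdClass objects
    $rows = json_decode(file_get_contents($file), true);
    if (is_array($rows)) {
        foreach ($rows as $row) {
            if (empty($row['internalKey'])) continue;
            $user = $modx->getObject('modUser', $row['internalKey']);
            if (!$user) continue;
            $profile = $user->getOne('Profile');
            if (!$profile) continue;
            $fields = $profile->get('extended');
            if (!is_array($fields)) $fields = array();
            // Everything except internalKey is treated as an extended attribute.
            unset($row['internalKey']);
            $profile->set('extended', array_merge($fields, $row));
            $profile->save();
        }
    }
}
```

This also avoids the malformed-last-row problem entirely, since json_decode() sees one valid array rather than fragments.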
So 10,328 users imported. Now that I know how, it'll only take a few minutes to do it again for the dev site just before we take it live.