Fixing Imported Data

When we first started, the SimplyPlural data importer was in a rougher place than it is today. As a result, we have noticed some duplicated data in our testing environment, and we have realized how lucky we are during development: locally, while building PluralSpace, we can simply refresh our database and re-insert fake data for testing.

Well, in production that isn’t as easy, so we are building a tool to help. If you imported data and ended up with duplicate members, journals, or anything else, this tool will de-duplicate them for you.

The tool reconciles your data down to only what you should have. It runs 100% locally, never decrypts any of your information, tells you how many duplicates were found, and lets you remove them.
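The post doesn’t describe how the tool detects duplicates, but since it works without decrypting anything, one plausible approach is fingerprinting the stored payloads as-is: records the importer inserted twice would be byte-identical, so hashing them finds the extra copies. A minimal sketch of that idea (the record shape and field names here are assumptions, not the actual PluralSpace schema):

```python
import hashlib
from collections import defaultdict

def find_duplicates(records):
    """Group records by a fingerprint of their stored payload.

    Payloads that are byte-identical (as a double import would produce)
    share a fingerprint; everything after the first record in a group
    is flagged as a duplicate. The payload is never decrypted.
    """
    groups = defaultdict(list)
    for record in records:
        fingerprint = hashlib.sha256(record["payload"]).hexdigest()
        groups[fingerprint].append(record)
    return [r for group in groups.values() for r in group[1:]]

# Hypothetical example: one member was imported twice.
members = [
    {"id": 1, "payload": b"member-alpha"},
    {"id": 2, "payload": b"member-alpha"},  # second copy from the importer
    {"id": 3, "payload": b"member-beta"},
]
dupes = find_duplicates(members)
print(len(dupes))  # → 1
```

Because only hashes of the stored bytes are compared, a check like this could run entirely on-device, matching the “100% local, no decryption” description.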

By default, duplicates go into an “archived” state for 30 days; after that, anything still in the archived state and marked as a duplicate is removed. If you notice that something was archived that shouldn’t have been, you can always go through the archived data and restore it before the 30 days are up.
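The 30-day window described above amounts to a soft-delete with an expiry. A rough sketch of that lifecycle, assuming each archived entry records when it was archived and whether the tool flagged it as a duplicate (field names are illustrative, not the real schema):

```python
from datetime import datetime, timedelta, timezone

ARCHIVE_WINDOW = timedelta(days=30)

def purge_expired(archived, now):
    """Split archived entries into (kept, removed).

    Only entries flagged as duplicates AND older than the 30-day
    window are removed; everything else stays restorable.
    """
    kept, removed = [], []
    for entry in archived:
        expired = now - entry["archived_at"] > ARCHIVE_WINDOW
        if entry["is_duplicate"] and expired:
            removed.append(entry)
        else:
            kept.append(entry)
    return kept, removed

# Hypothetical example, evaluated at a fixed "now" for clarity.
now = datetime(2025, 2, 1, tzinfo=timezone.utc)
archived = [
    # duplicate, 62 days old: past the window, gets removed
    {"id": 1, "is_duplicate": True,
     "archived_at": datetime(2024, 12, 1, tzinfo=timezone.utc)},
    # duplicate, 12 days old: still inside the window, restorable
    {"id": 2, "is_duplicate": True,
     "archived_at": datetime(2025, 1, 20, tzinfo=timezone.utc)},
    # manually archived member, never auto-removed
    {"id": 3, "is_duplicate": False,
     "archived_at": datetime(2024, 12, 1, tzinfo=timezone.utc)},
]
kept, removed = purge_expired(archived, now)
print([e["id"] for e in removed])  # → [1]
```

Note that the third entry is never purged regardless of age, mirroring the note below: manually archived members are only removed by an explicit “Delete”.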

NOTE
This isn’t changing how archived members work. It only changes how we handle duplicated data that got there from the importer. If you never run this tool, your duplicated data will remain. You can still manually archive a member at any time; archived members are never fully removed unless you click the “Delete” button.

Status: In Progress
Board: 🐛 Bugs
Date: About 4 hours ago
Author: PluralSpace