Q: Our system has over 700 possible duplicate records. What are your recommendations for cleaning them up?
A: That is a fairly tough question to answer. My standard advice is not to try to tackle it all at once. Cleaning up a large number of records can be intimidating, and slow progress can be discouraging. If you break the work down into smaller sets, it becomes much easier to tackle. For instance, commit to spending 10-15 minutes each day working through your list. At ten records per weekday, 700 duplicates works out to 70 working days, so you should be done in about 14 weeks. That may seem like a lot of time, but in the end you'll have much cleaner data.

With that said, more and more systems are trying to automate this process using matching algorithms. That can seem like a nice, easy way to work through all your duplicates, but you need to be careful with any automated tool. I would recommend using an automated process only for duplicates the system considers 100% matches and manually reviewing the rest. This cuts down on the risk of merging two constituents who may not actually be the same person.

I hope this is helpful, and feel free to reach out if you have any specific questions about the process.
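To make the "auto-merge only exact matches" idea concrete, here is a minimal sketch in Python. It assumes your system can export candidate duplicate pairs along with a match score from its matching algorithm; pairs it scores as 100% matches are queued for automated merging, while everything else goes onto a manual-review list. The record structure, field names, and threshold here are illustrative assumptions, not any particular vendor's API.

```python
from dataclasses import dataclass

@dataclass
class DuplicateCandidate:
    """A hypothetical pair of records flagged as possible duplicates."""
    record_id_a: str
    record_id_b: str
    match_score: float  # 0.0-1.0 confidence reported by the matching algorithm

def triage_duplicates(candidates, exact_threshold=1.0):
    """Split candidates into auto-merge (exact matches only) and manual review."""
    auto_merge = []
    manual_review = []
    for pair in candidates:
        if pair.match_score >= exact_threshold:
            auto_merge.append(pair)      # safe to hand to an automated merge job
        else:
            manual_review.append(pair)   # a person should confirm before merging
    return auto_merge, manual_review

# Example: only the 100% match is auto-merged; the rest go to the daily review list.
candidates = [
    DuplicateCandidate("C-1001", "C-2040", 1.00),
    DuplicateCandidate("C-1350", "C-1777", 0.92),
    DuplicateCandidate("C-0042", "C-0951", 0.65),
]
auto_merge, manual_review = triage_duplicates(candidates)
print(f"Auto-merge: {len(auto_merge)} pair(s); manual review: {len(manual_review)} pair(s)")
```

The manual-review list is also a natural place to apply the 10-15-minutes-a-day approach: work through a handful of the lower-confidence pairs each day rather than trying to clear them all at once.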