Data governance, quality assurance & labor saving
Did you know that, as well as being a full-time shoulder to cry on for governance matters, I am also a part-time dabbler in web development?
No?
Well, over the past few years my brother (Ronan) and I launched two social-network-style sites. The first is a community rating website called LikePlace and the second is Wandermates, an app for connecting solo travelers.
Unlike this brochureware site, where QA issues are limited to a few broken links or typos, both LikePlace & Wandermates collect oodles of membership and other data from visitors.
Inevitably, from time to time somebody will subscribe more than once, or encounter an issue that adds a duplicate record to our database. As a consequence I spend a fair amount of time QAing data, on top of the usual checks.
Thus far both sites are small enough in scale that automation is not really necessary; I can remove duplicates manually.
But not everyone has that luxury.
I know of many charities and retailers that collect the details of hundreds of thousands of donors or loyalty-card members into massive databases: names, emails, postal addresses, and so on.
It is also well known that such databases are riddled with duplicates, but they are so huge that no amount of manual effort alone could possibly clean them up.
Something else is needed.
Enter Dedupeyourdata, a new QA tool that can dedupe Excel files and marketing lists. All you need to do is upload your data in Excel, CSV or any of several other formats and let the deduplication software do the rest. Among its features, it can:
- Find duplicates in Excel lists
- Match one list against another
- Match records using sounds-like (phonetic) comparison
- View all duplicates and scroll through them
- Remove/delete duplicates and download clean lists
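To give a flavour of how sounds-like matching catches duplicates that exact comparison misses, here is a minimal sketch in Python using the classic Soundex phonetic code. This is purely illustrative: the `soundex` function, the `find_sounds_like_duplicates` helper and the sample rows are my own, and are not Dedupeyourdata's actual algorithm or API.

```python
def soundex(name: str) -> str:
    """Encode a name as a 4-character Soundex code (e.g. 'Smith' -> 'S530')."""
    codes = {**dict.fromkeys("bfpv", "1"), **dict.fromkeys("cgjkqsxz", "2"),
             **dict.fromkeys("dt", "3"), "l": "4",
             **dict.fromkeys("mn", "5"), "r": "6"}
    name = "".join(ch for ch in name.lower() if ch.isalpha())
    if not name:
        return ""
    encoded = name[0].upper()          # keep the first letter as-is
    prev = codes.get(name[0], "")
    for ch in name[1:]:
        digit = codes.get(ch, "")
        if digit and digit != prev:    # drop repeats of the same code
            encoded += digit
        if ch not in "hw":             # h and w do not separate repeated codes
            prev = digit
    return (encoded + "000")[:4]       # pad/truncate to 4 characters

def find_sounds_like_duplicates(rows):
    """Group rows whose surnames share a Soundex code (likely duplicates)."""
    groups = {}
    for row in rows:
        groups.setdefault(soundex(row["surname"]), []).append(row)
    return [group for group in groups.values() if len(group) > 1]

# Hypothetical membership rows: 'Smith' and 'Smyth' sound alike.
rows = [
    {"name": "Anne Smith", "surname": "Smith"},
    {"name": "Ann Smyth",  "surname": "Smyth"},
    {"name": "Bob Jones",  "surname": "Jones"},
]
duplicates = find_sounds_like_duplicates(rows)
```

Here `duplicates` contains a single group holding the Smith/Smyth pair, because both surnames encode to the same Soundex value, while Jones stands alone. A real tool would combine this with fuzzy matching on other fields (email, address) before flagging a record for removal.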
As an aid to maintaining clean data, I consider it an extremely useful addition to the suite of web QA tools.