This is actually the announcement, so it's "to be released, TBR" :)
Keeping in line with the "dupe" frenzy, I have been working on tools that let server owners better handle dupes. This tool in particular helps you get rid of duped items. It is partially based on a similar script created by FD Core Studio for finding duplicate items on your server.
The script "assumes" that if the same unique id exists more than once, those rows are all duplicate items. I use the same "GROUP BY" method FD Core uses to group the items; however, you do not need 1 GB of RAM to run it, as memory usage is much lower.
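If you are curious what that check roughly looks like, here is a minimal sketch of the idea (this is not the released script; it assumes the classic mssql extension, and the connection details and the table/column names such as Inventory and ItemSerial are placeholders you would swap for your own layout):

<?php
// Minimal sketch of the GROUP BY duplicate check (placeholder names).
// Requires the mssql extension; swap the credentials for your own.
$conn = mssql_connect('localhost', 'sa', 'password');
mssql_select_db('RF_World', $conn);

// Any serial that appears more than once is treated as a dupe.
$sql = "SELECT ItemSerial, COUNT(*) AS Copies
        FROM Inventory
        GROUP BY ItemSerial
        HAVING COUNT(*) > 1
        ORDER BY Copies DESC";

$res = mssql_query($sql, $conn);
while ($row = mssql_fetch_assoc($res)) {
    echo $row['ItemSerial'] . ' appears ' . $row['Copies'] . " times\n";
}
mssql_free_result($res);
?>

Because the counting happens inside the GROUP BY on the SQL server, PHP only ever holds one result row at a time, which is why the memory footprint stays small.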
Loading the ItemHistory database (the database used to check for duplicate items) requires you to run PHP from the CLI (command line).
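For anyone who has not used PHP from the command line before: you simply run the builder from a shell (something like php build_itemhistory.php) instead of through the web server, which avoids browser timeouts and the web server's execution time limit. The sketch below shows the general idea of pulling the inventory over in fixed-size batches so memory stays low; the file name, table names and columns are assumptions for illustration, not the actual release.

<?php
// build_itemhistory.php - illustrative sketch only, run as:  php build_itemhistory.php
// Copies inventory rows into the ItemHistory table in batches so memory stays low.
// Table/column names are placeholders (numeric serials assumed); adjust for your own World DB.
set_time_limit(0);

$conn = mssql_connect('localhost', 'sa', 'password');
mssql_select_db('RF_World', $conn);

$batch = 10000;  // rows per round trip
$last  = 0;      // keyset cursor on the inventory row id

while (true) {
    // TOP + "greater than the last id" paging works even on older MSSQL versions.
    $res = mssql_query("SELECT TOP $batch RowId, OwnerSerial, ItemSerial
                        FROM Inventory
                        WHERE RowId > $last
                        ORDER BY RowId", $conn);

    $copied = 0;
    while ($row = mssql_fetch_assoc($res)) {
        $last = (int)$row['RowId'];
        mssql_query("INSERT INTO ItemHistory (OwnerSerial, ItemSerial)
                     VALUES ({$row['OwnerSerial']}, {$row['ItemSerial']})", $conn);
        $copied++;
    }
    mssql_free_result($res);

    if ($copied === 0) break;            // nothing left to copy
    echo "Copied up to row $last\n";     // progress output on the console
}
?>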

Later on, once the committing of the information is complete, I will be adding an optional "delete items from the server" step. This option will basically give you the ability to wipe all the duplicate items from your server.
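To give an idea of what that option would boil down to (again only a sketch with placeholder names, not the final tool, and you would absolutely back up the World DB first), deleting the dupes is essentially a single statement once the duplicated serials are known:

<?php
// Sketch of the future "delete duped items" option - NOT the final tool.
// WARNING: this removes every copy of a duped serial, original included;
// back up your World DB before running anything like it. Placeholder names.
$conn = mssql_connect('localhost', 'sa', 'password');
mssql_select_db('RF_World', $conn);

mssql_query("DELETE FROM Inventory
             WHERE ItemSerial IN (
                 SELECT ItemSerial
                 FROM ItemHistory
                 GROUP BY ItemSerial
                 HAVING COUNT(*) > 1
             )", $conn);

echo mssql_rows_affected($conn) . " duped items removed\n";
?>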
You should not run this script too often, as it can take a very long time depending on how big and how old your server is; on small servers you will typically see around 500,000 items being added to the temp DB (as in the screenshot above), which took me around 2-3 minutes to complete.
I am also including a web interface to see exactly which items were duped, how many times, and who has the duped items.
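The report behind that web page is basically the same duplicate query with the owner joined back in; something along these lines, where the Characters table and its columns are again only placeholder names:

<?php
// Sketch of the "who holds the duped items" report (placeholder names).
$conn = mssql_connect('localhost', 'sa', 'password');
mssql_select_db('RF_World', $conn);

// First find every serial that is duped, then list each copy with its owner.
$sql = "SELECT h.ItemSerial, d.Copies, c.CharacterName
        FROM ItemHistory h
        JOIN (SELECT ItemSerial, COUNT(*) AS Copies
              FROM ItemHistory
              GROUP BY ItemSerial
              HAVING COUNT(*) > 1) d ON d.ItemSerial = h.ItemSerial
        JOIN Characters c ON c.CharacterSerial = h.OwnerSerial
        ORDER BY d.Copies DESC, h.ItemSerial";

echo "<table><tr><th>Serial</th><th>Copies</th><th>Held by</th></tr>";
$res = mssql_query($sql, $conn);
while ($row = mssql_fetch_assoc($res)) {
    echo '<tr><td>' . $row['ItemSerial'] . '</td><td>' . $row['Copies']
       . '</td><td>' . htmlspecialchars($row['CharacterName']) . '</td></tr>';
}
echo '</table>';
?>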

Right now, however, this only looks at the "inventory" table and not the Account Trunk or Trunk Extended tables, but I'm hoping to expand to those in a later release.

I will update this post with a link for people to download it once it is complete :)

Update [28/09/2010]

I just ran the ItemHistory database building script (the script that basically takes all your inventory data and puts it into an ItemHistory file) for 420,556 characters, which is 42,055,600 items in total, on an average computer (remember, the higher the computer specs, including processor and HDD speed, the better).
It took 2 hours 24 minutes and 38 seconds to complete.
That was actually fairly good! However, I believe I can still improve the speed and probably cut that down by 30-50%. The resulting database size ended up being 1.18 GB, and I do not believe I can reduce that any further due to the need for indexing.
At the end of the day, however, this is all still written in PHP. I will probably create a C++ app to do all of this, and it will definitely be much faster.

Update

I have now managed to reduce the time (for the same 42m items) to 1 hour 10 minutes and 43 seconds (~52% reduction in time) :)
I have also decided to scrap the use of SQLite, as it does not support cursors and requires all the data to be loaded at once, which is exceptionally bad. So now the data is loaded into a temp table created on the MSSQL server (RF WORLD DB) and is deleted every time the script is run again!
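For reference, the temp table handling is nothing exotic; something like the following runs at the start of every pass so the table is rebuilt from scratch each run. The OBJECT_ID check is there because older MSSQL versions have no DROP TABLE IF EXISTS, and the table name and columns are once more placeholders:

<?php
// Recreate the working table on the World DB at the start of each run
// (placeholder names); it is dropped and rebuilt every time the script runs.
$conn = mssql_connect('localhost', 'sa', 'password');
mssql_select_db('RF_World', $conn);

mssql_query("IF OBJECT_ID('dbo.ItemHistory', 'U') IS NOT NULL
                 DROP TABLE dbo.ItemHistory", $conn);

mssql_query("CREATE TABLE dbo.ItemHistory (
                 OwnerSerial INT    NOT NULL,
                 ItemSerial  BIGINT NOT NULL
             )", $conn);

// Index on the serial so the GROUP BY / HAVING pass stays fast.
mssql_query("CREATE INDEX IX_ItemHistory_Serial ON dbo.ItemHistory (ItemSerial)", $conn);
?>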