General SQL Development
 
  #1  
Old October 8th, 2002, 10:14 PM
dennisj
Registered User
Join Date: Aug 2002
Location: Carlsbad, CA
Posts: 2
How to Delete Duplicate Records?

In a MySQL (3.22.51) table I have three fields:

idx (Auto Increment)
filename
pathname

I use an INSERT command in a dump.sql file to add new records to my table.

INSERT INTO jpeglist (idx, filename, pathname) VALUES ( '', 'c-purpsparkle26w-1.jpg', 'dennisj/20021007');

Sometimes I end up adding the same record twice.

How can I find all duplicate records in Field 'filename' and delete the extra records?

Can I change the INSERT INTO command so new records are added and duplicate records are only updated?

TIA

Dennis

  #2  
Old October 9th, 2002, 03:56 AM
Lindset
weirdomoderator
Join Date: Jun 2002
Location: Alta, Norway
Posts: 370
Maybe you'll find this interesting?
http://www.mysql.com/doc/en/REPLACE.html
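
For context, a minimal sketch of what REPLACE does, assuming a UNIQUE index exists on filename (without one, REPLACE behaves like a plain INSERT):

-- If a row with this filename already exists (matched through a UNIQUE key),
-- MySQL deletes the old row and inserts the new one; otherwise it is a normal insert.
REPLACE INTO jpeglist (filename, pathname)
VALUES ('c-purpsparkle26w-1.jpg', 'dennisj/20021007');

On MySQL 4.1 and later, INSERT ... ON DUPLICATE KEY UPDATE is an alternative that updates the existing row instead of deleting and re-inserting it, which matches the "only update duplicates" part of the original question.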
__________________
Best Regards,
Håvard Lindset

  #3  
Old October 31st, 2002, 06:10 AM
sylow
Registered User
Join Date: May 2002
Location: Alkmaar, Netherlands
Posts: 11
SELECT *, COUNT(*) AS cnt FROM tableName GROUP BY fieldname1, fieldname2, ... HAVING cnt > 1

This will return one row for every combination of values that occurs in tableName more than once, along with how many times it occurs. Include all the fields except the auto-increment field in the GROUP BY part.

The rest is trivial.
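
Applied to the table from the first post, that sketch becomes (column names taken from the original INSERT):

-- List each filename that occurs more than once, together with how many copies exist.
SELECT filename, COUNT(*) AS cnt
FROM jpeglist
GROUP BY filename
HAVING cnt > 1;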

  #4  
Old January 7th, 2003, 01:46 PM
clivelr
Registered User
Join Date: Jun 2002
Location: New Zealand
Posts: 3
Why not use a unique index on the table?

That way, when it comes to actually inserting the record:
"INSERT INTO jpeglist (idx, filename, pathname) VALUES ( '', 'c-purpsparkle26w-1.jpg', 'dennisj/20021007');"
MySQL will complain about it being a duplicate and will not insert a new record. This eliminates duplicates without having to build any code for it.
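
A rough sketch of that approach, assuming the existing duplicates have already been removed (adding the index fails while duplicate filenames are still present; the index name ux_filename here is arbitrary):

-- Reject any future INSERT that repeats an existing filename.
ALTER TABLE jpeglist ADD UNIQUE INDEX ux_filename (filename);

-- Or silently skip duplicates instead of raising an error:
INSERT IGNORE INTO jpeglist (filename, pathname)
VALUES ('c-purpsparkle26w-1.jpg', 'dennisj/20021007');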

Thanx
Clive

  #5  
Old June 6th, 2005, 02:00 AM
Turbo
Registered User
Join Date: Dec 2004
Posts: 2
Everyone has different problems with files. I know how to solve the problem of duplicate files: I use Duplicate Checker PRO 6.0 for such purposes (http://www.atory.com/Dupe_Checker_PRO/). I suggest you try it if you experience the "dupe problem"...

Good luck!

  #6  
Old June 14th, 2005, 07:12 PM
Madpawn
My beat is correct.
Join Date: Dec 2004
Posts: 339
You've exhumed a long-dead thread to pimp an irrelevant product (the problem is duplicate records in a database, not duplicate files).

Please don't do that.
__________________
"A pawn is the most important piece on the chessboard -- to a pawn"


  #7  
Old August 9th, 2006, 08:23 AM
D_D
Registered User
Join Date: Jan 2006
Posts: 3
Find and delete duplicate files and folders by Dupe Checker PRO

  #8  
Old August 9th, 2006, 08:47 AM
MadCowDzz
I'm Internet Famous
Join Date: Jan 2003
Location: Toronto, Canada
Posts: 2,886
Deja vu.
__________________
Daryl's Homepage | My Blogroll | My Profile | Firefox supporter!
DevArticles Forum Moderator

"The net is a waste of time, and that's exactly what's right about it." -- William Gibson

  #9  
Old December 13th, 2010, 12:21 PM
jyotidayal
Registered User
Join Date: Dec 2010
Posts: 1
I ran into a problem of duplicate files and someone told me about duplicates-finder.com

  #10  
Old December 21st, 2010, 06:16 AM
anitadayal
Registered User
Join Date: Dec 2010
Posts: 1
Delete Long Path File

Hi. I ran into that problem too. And after days of searching I finally found this software: Delete Long Path File Tool.
It's GREAT. You can find it here:
longpathtool

  #11  
Old April 22nd, 2011, 04:59 PM
crismanaon
Registered User
Join Date: Apr 2011
Posts: 4
DELETE FROM table_name A
WHERE rowid > (SELECT MIN(rowid) FROM table_name B WHERE B.keyword = A.keyword);
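
That is Oracle-style syntax (rowid). A rough MySQL equivalent for the table in the first post, using the auto-increment idx in place of rowid and filename as the duplicated column (the multi-table DELETE form needs MySQL 4.0 or later):

-- For each filename, keep the row with the lowest idx and delete the rest.
DELETE a
FROM jpeglist a
JOIN jpeglist b
  ON b.filename = a.filename
 AND b.idx < a.idx;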

  #12  
Old November 5th, 2012, 11:25 PM
LuiFigo
Registered User
Join Date: Nov 2012
Posts: 1
Quote:
Originally Posted by dennisj
In a MySQL (3.22.51) table I have three fields:

idx (Auto Increment)
filename
pathname

I use an INSERT command in a dump.sql file to add new records to my table.

INSERT INTO jpeglist (idx, filename, pathname) VALUES ( '', 'c-purpsparkle26w-1.jpg', 'dennisj/20021007');

Sometimes I end up adding the same record twice.

How can I find all duplicate records in Field 'filename' and delete the extra records?

Can I change the INSERT INTO command so new records are added and duplicate records are only updated?

TIA

Dennis


So you are looking for an application that deletes values from the database, but for that purpose I think SQL by itself is enough to delete the duplicate values, and there are various constraints available in SQL that can keep duplicate values from being entered in the first place.
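
One constraint-based shortcut along those lines, hedged for version: MySQL releases before 5.7.4 accepted an IGNORE modifier on ALTER TABLE that dropped the duplicate rows while adding the unique index in a single step (the index name ux_filename is arbitrary):

-- Keeps the first row for each filename, silently discards the other duplicates,
-- and enforces uniqueness from then on (older MySQL only; IGNORE was removed in 5.7.4).
ALTER IGNORE TABLE jpeglist ADD UNIQUE INDEX ux_filename (filename);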

  #13  
Old January 7th, 2013, 07:11 AM
Rkjobdft
Registered User
Join Date: Jan 2013
Posts: 5
Sometimes I end up adding the same record twice.


  #14  
Old January 12th, 2013, 04:06 AM
adamchieef
Registered User
Join Date: Jan 2013
Posts: 1
I use the program from DuplicateFilesDeleter.com to find and delete duplicates. You may try it. Thanks

  #15  
Old September 16th, 2013, 05:03 AM
vipin4u
Registered User
Join Date: Sep 2013
Posts: 8
You can use some simple queries to do the same. In Postgres I solve the problem with this query:
DELETE FROM table_name WHERE ctid NOT IN (SELECT MAX(ctid) FROM table_name GROUP BY column1 [, column2, ...]);
where ctid is a system column with unique values. You just need to find the MySQL alternative to ctid.
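
In MySQL, the auto-increment idx from the first post can play the role of ctid; the derived table below works around MySQL's restriction on selecting from the same table that is being deleted from (a sketch; use MAX(idx) instead to keep the newest copy):

-- Keep only the lowest-idx row for each filename and delete every other copy.
DELETE FROM jpeglist
WHERE idx NOT IN (
    SELECT keep_idx FROM (
        SELECT MIN(idx) AS keep_idx
        FROM jpeglist
        GROUP BY filename
    ) AS keepers
);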
