
I have a PHP application that inserts data into MySQL; each row contains a randomly generated unique value. The string has about 1 billion possible values, with probably no more than 1 or 2 million entries in the table at any one time. Essentially, most combinations will not exist in the database.

I'm trying to find the least expensive approach to ensuring a unique value on insert. Specifically, my two options are:

  1. Have a function that generates the unique ID. On each generation, check whether the value already exists in the database; if it does, re-generate, otherwise return the value.
  2. Generate a random string and attempt the insert. If the insert fails, check whether the error is 1062 (MySQL "Duplicate entry X for key Y"); if so, re-generate the key and insert again with the new value.

Is it a bad idea to rely upon the MySQL error for re-trying the insert? As I see it, the value will probably be unique, so the initial existence check (technique 1) seems unnecessary.
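
To make option 2 concrete, here's a minimal sketch (assuming a mysqli connection in $mysqli, a hypothetical build_code() generator, and a UNIQUE index on the code column; none of these names are final):

    // Option 2: insert blindly, retry only when MySQL reports a duplicate key (error 1062).
    mysqli_report(MYSQLI_REPORT_OFF);             // make execute() return false instead of throwing
    $stmt = $mysqli->prepare("INSERT INTO certificates (code) VALUES (?)");
    do {
        $code = build_code();                     // hypothetical random-string generator
        $stmt->bind_param('s', $code);
        $ok = $stmt->execute();
        if (!$ok && $stmt->errno !== 1062) {      // 1062 = duplicate entry; anything else is a real error
            throw new RuntimeException($stmt->error);
        }
    } while (!$ok);                               // loop only while we keep hitting duplicates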

EDIT #1

I should also have mentioned that the value must be a 6-character string composed of uppercase letters and/or numbers. It can't be incremental either; it has to be random.
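
For illustration, a generator along these lines is what I have in mind (just a sketch; random_int() requires PHP 7+, and the function name is a placeholder):

    // Build a 6-character code from uppercase letters and digits (36 possibilities per character).
    function build_code(int $length = 6): string {
        $alphabet = 'ABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789';
        $code = '';
        for ($i = 0; $i < $length; $i++) {
            $code .= $alphabet[random_int(0, strlen($alphabet) - 1)];   // cryptographically secure pick
        }
        return $code;
    }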

EDIT #2

As a side note, I'm trying to create a redemption code for a gift certificate that is difficult to guess. Using numbers and letters creates 36 possibilities for each character, instead of 10 for just numbers or 26 for just letters.
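
For what it's worth, that works out to 36^6 = 2,176,782,336 possible codes; with roughly 2 million codes live at any one time, a single random draw has about a 2,000,000 / 2,176,782,336 ≈ 0.09% chance of colliding with an existing one, so retries should be rare.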

Here's a stripped-down version of the solution I created. The first value in the table is the primary key, which is auto-incremented. mysqli's affected_rows will equal 1 if the insert succeeds, and 0 when the ON DUPLICATE KEY UPDATE fires because the code already exists:

    $code = $build_code();                       // $build_code holds the generator; $mysqli is an open connection
    $stmt = $mysqli->prepare("INSERT INTO certificates VALUES ('', ?) ON DUPLICATE KEY UPDATE pk = pk");
    $stmt->bind_param('s', $code);               // bound by reference, so re-assigning $code is enough
    do {
        $stmt->execute();
    } while ($stmt->affected_rows == 0 && ($code = $build_code()));
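
For context, the table is defined roughly like this (a sketch; only the UNIQUE index on code matters for the ON DUPLICATE KEY trick, the other details are assumptions):

    -- Auto-incremented primary key plus a UNIQUE 6-character code
    CREATE TABLE certificates (
        pk   INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
        code CHAR(6)      NOT NULL,
        UNIQUE KEY uq_code (code)
    );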
  • Did you take a look at the uniqid function? Commented Feb 18, 2012 at 22:09
  • Side discussion: I have met this kind of problem too. A nice solution was to use a random string combined with a timestamp. As a result, the string can't be guessed, and you ensure that there is no collision when generating the string/inserting in database. Commented Feb 18, 2012 at 23:35
  • A rule of thumb when deciding how big to make your random unique ID: either make it so big that there is no point in looking for duplicates (there is a higher chance that the hardware is defective and the check gives the wrong answer than there is of a duplicate), or make it small enough that duplicates happen all the time (so you can test your code in finite time), and add a check to deal with them. Commented Apr 5, 2013 at 11:30

3 Answers

2

Is it a bad idea to rely upon the MySQL error for re-trying the insert?

Nope. Go ahead and use it if you want. In fact, many people think that if you check first and the value doesn't exist, then it's safe to insert. But unless you lock the table, it's always possible that another process will slip in and grab the ID between your check and your insert.

So go ahead and generate a random ID if it suits your purpose. Just make sure you test your code so it properly handles duplicates. It might also be useful to log duplicates, just to verify that your assumptions about how unlikely they are to occur are correct.



0

Define your table with a unique constraint:

http://dev.mysql.com/doc/refman/5.0/en/constraint-primary-key.html

1 Comment

He already uses that; see the question (MySQL duplicate entry X for key Y).
-1

Why not just use "YourColName BIGINT AUTO_INCREMENT PRIMARY KEY" to ensure uniqueness?

2 Comments

Because AUTO_INCREMENT generates sequential numbers, not random ones (and they don't contain letters).
I guess I just don't understand why you would want to do this?
