I have constructed the following regex, which should accept only strings that satisfy all three of these conditions:
- Allows alphanumeric characters.
- Allows the special characters listed in the regex.
- The string length must be between 8 and 20 characters.
The regex is:
"^(?=.*[a-z])(?=.*[A-Z])(?=.*\d)(?=.*[$@$!%*?&])[A-Za-z\d$@$!%*?&]$"
I use the following JavaScript code to verify the input:
var regPassword = new RegExp("^(?=.*[a-z])(?=.*[A-Z])(?=.*\d)(?=.*[$@$!%*?&])[A-Za-z\d$@$!%*?&]$");
regPassword.test(form.passwordField.value);
The test() method returns false for inputs such as abc123!ZXCBN. I have tried to locate the problem in the regex without success. What causes the validation to fail?
Two things break the pattern. First, the final character class [A-Za-z\d$@$!%*?&] has no quantifier, so after the lookaheads the regex matches exactly one character; your {8,20} length requirement never made it into the pattern. Second, inside a string literal passed to new RegExp, "\d" is read as the plain letter d, so (?=.*\d) becomes (?=.*d) and demands a literal d. You would need "\\d", or better, a regex literal, which avoids double escaping entirely. (The duplicated $ inside the character class is harmless but redundant.)

If all you really need is a character whitelist plus a length check, /^[a-z0-9!@#$%]{8,20}$/i is enough. But I have to ask: why are you so fixated on using a single regex that is, de facto, beyond your current skill level? That only makes your code harder to understand, debug, and maintain.