Hi all,
My knowledge of regular expressions is average. I have this one in
ASP.NET:
^\d+(\.\d\d?)?
To my eyes, it correctly validates any whole number (e.g. 1, 100) and
any decimal number with one or two decimals (42.0, 42.42, 420.42, etc.).
So why does the same pattern in JavaScript, declared like this:
var regex = /^\d+(\.\d\d?)?/
produce the following results?
regex.test('abc')    // false, correct
regex.test('42er')   // true, INCORRECT (contains alphabetic chars)
regex.test('42.420') // true, INCORRECT (3 digits after the point)
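
For reference, here is a minimal snippet showing what I see, plus an
anchored variant I tried on a hunch (I'm assuming the missing $ anchor
lets test() succeed on just a prefix of the string, but I'm not certain
that's the actual issue):

// Unanchored pattern: test() appears to return true as long as the
// match starts at the beginning of the string, even if it only
// covers a prefix.
var regex = /^\d+(\.\d\d?)?/;
console.log(regex.test('42er'));    // true - matches the "42" prefix
console.log(regex.test('42.420'));  // true - matches "42.42", ignores the trailing "0"

// Variant with a $ anchor, which I assume forces a full-string match:
var anchored = /^\d+(\.\d\d?)?$/;
console.log(anchored.test('42er'));    // false
console.log(anchored.test('42.420')); // false
console.log(anchored.test('42.42'));  // true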
Does anyone have suggestions, please?
Thanks!
ibiza