Every fixed-pixel* display "upconverts". How else could it fit a 720x480** standard-definition picture onto its 1366x768*** native resolution? The question is, which device does the scaling better? I find it odd to assume that every DVD player does it better than any display, especially when DVD hardware often costs less than $100 while displays cost more than $1000.
* Fixed-pixel display as in LCDs, Plasmas, and DLPs
** Replace with your favorite standard, such as 720x576 for PAL, etc.
*** Replace with your favorite display's pixel count such as 1920x1080 for "FullHD"
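To make the numbers concrete, here's a rough sketch (in Python) of the arithmetic any scaler, whether in the player or in the display, has to do when fitting a source frame onto a fixed-pixel panel without distorting it. This is a simplified illustration that ignores DVD's non-square pixels; the function name and approach are mine, not any specific device's.

```python
def fit_to_panel(src_w, src_h, panel_w, panel_h):
    """Scale a source frame to fill a fixed-pixel panel without
    distorting it: pick the smaller of the two scale factors,
    then center the result (the leftover pixels become black bars)."""
    scale = min(panel_w / src_w, panel_h / src_h)
    out_w, out_h = round(src_w * scale), round(src_h * scale)
    bar_x = (panel_w - out_w) // 2   # pillarbox width on each side
    bar_y = (panel_h - out_h) // 2   # letterbox height on top/bottom
    return out_w, out_h, bar_x, bar_y

# NTSC DVD frame onto a 1366x768 panel (treating pixels as square):
print(fit_to_panel(720, 480, 1366, 768))  # → (1152, 768, 107, 0)
```

So the 480 lines get stretched to all 768 panel lines either way; the only question is whose interpolation does it with fewer artifacts.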
edit: Old-school CRT displays directly scan whatever they get if it's within their scan range****, in which case an "upconverting" DVD player would be useful for eliminating visible scan lines or flicker. Remember that upconverting doesn't magically add any additional detail. Real HD will always look more detailed.
**** Most CRT televisions only accept 480-line or 576-line input, interlaced at 60 or 50 Hz respectively, in which case an upconverting DVD player wouldn't be of any use. CRT TVs with component input often accept 1080-line input at 60 Hz interlaced or 480-line input at 60 Hz progressive, in which case an upconverting DVD player would improve the image a little. Computer CRT monitors accept nearly anything. My 10-year-old monitor works great with 1080-line input at 60 Hz progressive (so-called "1080p"). In fact, it accepts up to 1536 lines at 60 Hz progressive (my video card can't generate more lines than that). Too bad monitors are tiny.