Um, I realize the digital signal has been "reconstructed" into an analog signal and presented to the ear via the normal chain. However, this signal is like reconstituted food: it might look like the real thing, but it isn't. I personally believe the brain knows there is something wrong with what it hears and tries to convince itself that the source is naturally analog, but it is not.
Early CD players, from the early to mid 1980s, used output filters that introduced additional noise artifacts into the analog output. This noise was interpreted by the ear as "coldness," "harshness," and "digital glare." Well-designed modern digital players don't have these issues. Compounding the problems with early digital recordings, studio personnel faced a steep learning curve in making good digital recordings, just as photographers did when transitioning from film to digital cameras.
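For what it's worth, the "reconstituted food" worry can be tested numerically. Sampling theory says a signal containing only frequencies below half the sample rate can, in principle, be rebuilt exactly from its samples by sinc (Whittaker-Shannon) interpolation; real players only approximate the ideal filter, which is where the early designs fell short. The sketch below is purely illustrative (it assumes NumPy, and `sinc_reconstruct` is a name I made up): it samples a 1 kHz tone at 48 kHz and reconstructs it between the sample instants.

```python
import numpy as np

def sinc_reconstruct(samples, fs, t):
    """Rebuild a bandlimited signal at times t from its samples using
    ideal (Whittaker-Shannon) sinc interpolation.
    np.sinc is the normalized sinc: sin(pi*x) / (pi*x)."""
    n = np.arange(len(samples))
    return np.sum(samples[None, :] * np.sinc(fs * t[:, None] - n[None, :]),
                  axis=1)

fs = 48_000.0            # sample rate, Hz
f0 = 1_000.0             # test tone, well below the 24 kHz Nyquist limit
n = np.arange(512)
samples = np.sin(2 * np.pi * f0 * n / fs)

# Evaluate between sample instants, away from the edges where the
# finite sinc sum is truncated.
t = np.linspace(100 / fs, 400 / fs, 1000)
recon = sinc_reconstruct(samples, fs, t)
truth = np.sin(2 * np.pi * f0 * t)
print(f"max reconstruction error: {np.max(np.abs(recon - truth)):.2e}")
```

The residual error here comes only from truncating the infinite sinc sum; with enough samples on either side it shrinks toward zero. The practical argument is not whether reconstruction can work, but how well a given player's analog filter approximates it.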
Many of the most popular music titles have gone through several "digital re-masterings" to correct the recording errors made in previous digital editions.
I believe there are people who can perceive digitally sourced music, much as wine tasters can pinpoint a wine's region, vineyard, and vintage year.
With wine, there are "flavor profiles" that describe and catalog the differences in taste, aroma, appearance, and mouthfeel of a particular vintage. I am not aware of any such effort for digital recordings. Specifically, I don't know of any study showing that digital recordings have sound characteristics unique to them and distinct from analog recordings. It is certainly possible to distinguish a bad recording from a good one, if one knows what to listen for. The sound-quality defects I have heard on digital recordings I have also heard on bad analog recordings.
Obviously, the rest of the world has issues with the digital format; otherwise, why would there still be constant attempts at making it better? Obviously (again), there must be some "golden ears" who can still hear the graininess of a digital source, no matter how high the sampling rate or how deep the bit depth.
Continuous improvement is not necessarily a sign of poor design. With analog, there are constant efforts to improve turntable materials, construction methods, and drive motors, as well as cartridge and tonearm designs.
From a commercial standpoint, it makes a lot of sense to keep improving digital formats since most audio consumers listen to music via digital media, and especially portable digital media such as cell phones and iPods.
I personally find that longer listening sessions bother me when digital is the source. I never seem to have that problem with vinyl.
Listening fatigue is a very real phenomenon. However, I would ask about the quality of your digital source (and of your digital media) compared to the quality of your vinyl source (and vinyl media). Are they of comparable quality and resolution?
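On "resolution," one hedge-free point of reference does exist: the theoretical quantization signal-to-noise ratio of an N-bit PCM system with a full-scale sine input is approximately 6.02*N + 1.76 dB. The snippet below just evaluates that textbook formula (the function name is mine, for illustration):

```python
def quantization_snr_db(bits: int) -> float:
    """Theoretical SNR of an ideal N-bit PCM quantizer driven by a
    full-scale sine wave: 6.02*N + 1.76 dB."""
    return 6.02 * bits + 1.76

for bits in (16, 24):
    print(f"{bits}-bit PCM: ~{quantization_snr_db(bits):.1f} dB SNR")
# 16-bit PCM: ~98.1 dB SNR
# 24-bit PCM: ~146.2 dB SNR
```

Both figures comfortably exceed the dynamic range typically attributed to vinyl playback, which is why, when comparing formats for listening fatigue, the mastering and playback chain usually matter far more than the storage medium's numeric resolution.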