I'm sure there are devices where you add a pure compound and they automatically test its solubility in a wide range of solvents and log the data. Why isn't there a network of these devices connected to an internet database? It would eliminate human error, so the values would be more reliable than human-logged data, and there'd be no more wasting time on manual solubility tests or scouring the internet for solubility info on a particular compound. I have a handful of uncommon compounds I'd enter into the database myself.
As it is now, I use things like the internet or the Merck Index, and most of the time all I can find is "compound is sparingly soluble in water, readily soluble in acetone", etc. That tells you very little. If, on the other hand, you had a big table with actual numerical values for a wide range of solvents, including uncommon ones, life would be so much easier for research chemists. They've started things like the Cambridge Structural Database, which is revolutionizing things, but I can think of a thousand other databases that would be equally useful. I'm guessing some people here have the connections to start up a project like this; any reason why not? I reckon the government would fund it. Have some competent companies build the devices and make sure they're calibrated right, put one in every university, then just sit back and watch the database grow.
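To make the "big table with actual numerical values" idea concrete, here's a minimal sketch of what one table in such a database might look like. The column names, units, and every value below are my own guesses for illustration, not an existing standard or real instrument output:

```python
import sqlite3

# In-memory database standing in for the shared internet database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE solubility (
        compound    TEXT NOT NULL,   -- e.g. name, InChI, or registry number
        solvent     TEXT NOT NULL,
        temp_c      REAL NOT NULL,   -- measurement temperature, deg C
        g_per_100ml REAL NOT NULL,   -- grams of solute per 100 mL of solvent
        source      TEXT             -- which instrument/lab logged the value
    )
""")

# Hypothetical entries (approximate handbook-style values at 25 C);
# in the real scheme these rows would come straight from the instruments.
rows = [
    ("sodium chloride", "water",   25.0, 36.0,  "uni-A/device-1"),
    ("sodium chloride", "ethanol", 25.0,  0.065, "uni-A/device-1"),
    ("benzoic acid",    "water",   25.0,  0.34,  "uni-B/device-3"),
]
conn.executemany("INSERT INTO solubility VALUES (?,?,?,?,?)", rows)

# A numeric answer instead of "sparingly soluble":
cur = conn.execute(
    "SELECT g_per_100ml FROM solubility WHERE compound=? AND solvent=?",
    ("benzoic acid", "water"),
)
print(cur.fetchone()[0])
```

The point of the `source` column is the calibration issue above: if every value is traceable to a specific device, a suspect instrument's data can be flagged or re-measured without poisoning the whole table.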
EDIT: On top of making solubility info readily available, we could develop software that uses complex algorithms to find trends in the data, and that may lead to a new paradigm in our understanding of solubility. With a database containing, say, a million compounds and a few thousand solvents, I reckon a supercomputer could find some very interesting trends that no human could. I'd personally start with salts: how do the various counterions affect the solubility of a compound, and can we find universal trends that would let us predict the value for any unknown molecule?
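A crude sketch of the counterion idea. The numbers below are approximate handbook-style water solubilities (g/100 mL near 25 C) picked by me for illustration; the aggregation is the kind of thing a mining algorithm would do over millions of rows, done here by hand on six:

```python
from statistics import mean

# (cation, anion, approximate water solubility in g/100 mL near 25 C)
salts = [
    ("Na+", "Cl-",    36.0),
    ("K+",  "Cl-",    35.5),
    ("Na+", "NO3-",   91.2),
    ("K+",  "NO3-",   38.3),
    ("Na+", "SO4^2-", 28.1),
    ("K+",  "SO4^2-", 12.0),
]

# Group by anion and average across cations to see how the counterion
# shifts solubility -- the simplest possible "trend" query.
by_anion = {}
for cation, anion, sol in salts:
    by_anion.setdefault(anion, []).append(sol)

for anion, vals in sorted(by_anion.items()):
    print(anion, round(mean(vals), 1))
```

Even this toy grouping hints at real structure (nitrates running more soluble than sulfates here); with a thousand solvents and a million compounds, the same query family, plus proper statistics, is where I'd expect the interesting trends to fall out.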