What is my opinion? I think Google is no stranger to providing useful results, but user intent remains a hard thing to predict. Can a combination of on-site (the publisher) and off-site (the public, competitors, and affiliates) signals determine the most accurate result for a search? I think so. For a search engine, the measure of quality is returning exactly what the user is looking for, ranked from most to least relevant, while accounting for the unpredictability of each person's language.
The only real way to define relevance is to add user data to the equation. It also makes total sense as a way of diminishing the power of spam.
The one thing I would hope is for Google to use not only stickiness but the user experience itself: not just time on site, but conversion data as well. If stickiness alone were the signal, publishers could game it with longer landing pages, since people would invest an extra second just figuring out whether the result was relevant or not.
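To make the idea concrete, here is a minimal sketch of how dwell time and conversion could be blended into one relevance signal. Everything here is an assumption for illustration (the function name, the weights, the 120-second cap); it is not how Google actually scores results.

```python
# Hypothetical sketch: blending "stickiness" (dwell time) with a
# conversion signal so that long-but-useless pages don't win.
# Weights and the dwell cap are illustrative assumptions only.

def relevance_score(dwell_seconds, converted, w_dwell=0.3, w_conv=0.7):
    """Score a result click on a 0..1 scale.

    dwell_seconds: time the user spent on the page after clicking.
    converted: whether the visit ended in a meaningful action
               (purchase, signup, download, etc.).
    """
    # Cap dwell time so padding a landing page with extra content
    # yields diminishing returns instead of a linear boost.
    dwell_signal = min(dwell_seconds, 120) / 120
    conv_signal = 1.0 if converted else 0.0
    return w_dwell * dwell_signal + w_conv * conv_signal

# A long visit with no conversion scores lower than a short
# visit that converted:
sticky_only = relevance_score(300, converted=False)
quick_convert = relevance_score(30, converted=True)
```

The cap is the key design choice: once dwell time saturates, the only way to raise the score further is an actual conversion, which is much harder to fake than an extra second of reading.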
Finally, a trend I believe to be real is that users are getting better at using search engines. The language people searched with at the beginning was different from the language they use today (take a look at the concept of long-tail queries). This change in behavior should help engines return better results as the web grows.
I think that’s why Google should support SEO: optimizing sites ultimately spreads good practices across the web and increases users’ awareness of what search engines can do.