Today, SEO is swiftly approaching its saturation point. More and more webmasters realise the necessity of learning SEO basics, and as they do, SEO professionals are finding it harder to win new clients. With all niche sites optimised, it will be harder to compete for good key phrases. Link-building opportunities will be found and used by everyone, and keyword density will reach its optimum value, meaning that SERPs will consist of equally good, equally relevant sites - at least from the traditional SEO point of view.

Spammy techniques, still popular and sometimes even effective, will exhaust themselves even more quickly. There are, in truth, not that many different ways of deceiving search engines and inflating a site's relevancy artificially; today they differ only in the details. Perhaps that explains why we don't see spammy sites in SERPs as often as we used to - the spiders catch them quickly and discard the low-grade material to keep the web cleaner. As soon as spiders become smart enough to recognise spam on the fly, the class of "SEO specialists" propagating such rubbish will find themselves out of work. It is not really hard to tell an ugly doorway page from the real thing.
So who will survive? What is the way to tomorrow in SEO science?
First of all, we should monitor and analyse the latest tendencies, then extrapolate them and make educated guesses about how things may look in the future. Finally, we put those guesses to the test using logic and common sense.
This will show us the true answers and help us compete when the time comes to offer ground-breaking SEO services that exploit the new qualities of search engines.
And common sense tells us that the core purpose of search engines will never change: they are supposed to deliver the best results they can. If they are not always so good at it today, the reason is often their limited resources; but that will change over time.
The search engines of the future will be capable of reading JavaScript, CSS, Flash and other content that is invisible to them now. This is technically possible already, but it requires more complicated algorithms and more bandwidth, so they are in no hurry to implement it just yet. For now they prefer to sacrifice these additional capabilities in favour of spider speed and index freshness. But as the technical constraints ease, search engines will improve and create new sensations every day, all the more so because they must constantly compete with one another.
Thus, JavaScript links will count. CSS spam will be easily detected and banned. Flash sites will become a new niche for SEO specialists - at the moment they require an HTML version to be subjected to search engine optimisation.
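To make the CSS-spam point concrete, here is a minimal sketch of the kind of heuristic a crawler could apply to inline styles. The function name, the pattern list and the idea of scanning inline styles are all illustrative assumptions, not any real search engine's algorithm:

```python
import re

# Hypothetical illustration: patterns for classic text-hiding tricks that a
# smarter spider could flag as CSS spam. The list is an assumption, not an
# actual engine's ruleset.
HIDDEN_PATTERNS = [
    re.compile(r"display\s*:\s*none", re.I),
    re.compile(r"visibility\s*:\s*hidden", re.I),
    re.compile(r"text-indent\s*:\s*-\d{3,}px", re.I),  # text pushed off-screen
    re.compile(r"font-size\s*:\s*0", re.I),            # invisibly small text
]

def looks_like_css_spam(style: str) -> bool:
    """Return True if an inline style string matches a known hiding trick."""
    return any(p.search(style) for p in HIDDEN_PATTERNS)

print(looks_like_css_spam("display:none; color:red"))  # True
print(looks_like_css_spam("color: #333; margin: 0"))   # False
```

A real detector would of course also have to resolve external stylesheets and computed styles, which is exactly the extra algorithmic and bandwidth cost the article describes.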
But these changes are not the most important ones. Link popularity analysis algorithms are sure to become more sophisticated - capable of judging how plausible one or another link pattern is, given information on a site's age, size and content. That will mean death to link schemes, link farms, pyramids, automated submissions, and masses of links with identical anchor text - and it may well shake the basis of today's reciprocal linking strategies. Relevancy will mean more, and when complementary businesses link their sites to each other, search engines will become capable of seeing whether they are really complementary, not just pretending to be.
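One of the simplest signals such an algorithm could use is anchor-text diversity: a natural backlink profile varies, while a manufactured one repeats the same phrase. The sketch below is a hypothetical heuristic under that assumption; the 60% threshold is invented for illustration:

```python
from collections import Counter

# Hypothetical sketch of one plausibility check a smarter link-analysis
# algorithm might run: if a single anchor text accounts for most of a site's
# backlinks, the pattern looks manufactured rather than organic.
def anchor_text_suspicious(anchors: list[str], threshold: float = 0.6) -> bool:
    """True if any single anchor text exceeds `threshold` of all backlinks."""
    if not anchors:
        return False
    counts = Counter(a.strip().lower() for a in anchors)
    top_share = counts.most_common(1)[0][1] / len(anchors)
    return top_share > threshold

natural = ["Acme Widgets", "acme.example.com", "great widget shop", "click here"]
scheme = ["cheap widgets"] * 9 + ["Acme Widgets"]  # 90% identical anchors

print(anchor_text_suspicious(natural))  # False
print(anchor_text_suspicious(scheme))   # True
```

A production system would combine many such signals with the site's age, size and content, as the paragraph above suggests, but even this toy check would catch the "numerous links with same anchor text" pattern the article predicts will die out.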