Saturday, December 29, 2007
How To Search Google

The best search engine on the net is Google (http://google.com), so this tutorial is based on the Google search engine.
Why is Google the best search engine?
Google does not accept "payment for placement". This does not mean Google won't let advertisers pay for listings; it puts those paid listings in a separate area and makes it very clear they appear there because someone paid to be included when particular search words are used. Many other search engines do accept payment for placement, which means those advertisers show up at the top of your results not because they are necessarily the best match for your search, but because they paid to be there. The more they pay, the higher they appear in the list.
Google's spiders look at more than just keywords. When Google updates its databases, it uses many criteria to find websites related to specific search words. It sends out spider robots that examine each site's keyword tags, title tag, the actual words in the text, and the words in the alt tags attached to pictures, and it also looks at how the site ranks for popularity. All of these criteria contribute to how high in the results the site will appear. So when you search Google, the odds that the first few sites listed actually have the information you want are much better than with many other search engines. And the popularity rank improves the odds that the site has a good reputation among its peers, since popularity is determined by how many other sites link to it.
Google lets you search for more than just websites. At the top of the Google page, you see tabs for Web, Images, Groups, Directory, and News. So, instead of just searching the web, you can look for pictures related to your search words (Images tab), newsgroups where your question may already have been answered (Groups tab), directories related to your topic (Directory tab), and current news articles about your topic from just about every major newspaper in the world (News tab).
Google's Advanced Search makes it all easy. There is no need to know Boolean search parameters like AND, OR and NOT, and no need to remember when to put a phrase in quotes or parentheses to narrow down your search. Just click the Advanced Search link and look at all your choices (the equivalent search-box operators are sketched in an example after this list):
1. Note that you can search for all of the words you type, and Google will show you only the pages that include every word you typed. Example: searching for excel vlookup will find all the pages that have excel and vlookup in them, but not pages that only have excel in them, nor pages that only have vlookup in them.
2. You can search for an exact phrase, so only pages that include that full phrase or sentence will be found. Example: searching for excel vlookup will only return pages that have "excel vlookup" as a phrase, and not pages that have excel in one place on the page and vlookup in another location on the page. (This is especially helpful if you are searching for an error message you received on your computer.)
3. If you search for at least one of the words, Google will find all of the pages that include ANY of the words you typed. (Note that the plain home-page search box normally looks for all of your words, so use this option when you deliberately want pages that match any one of your terms; it corresponds to the OR operator.) Example: searching for excel vlookup will return pages that have both excel and vlookup, but will also find pages that have only excel in them, or only vlookup in them.
4. Without the words allows you to eliminate pages that include specific words. So, if you want to find all sites that include one word but not another, this option lets you do that. Example: entering excel vlookup in one of the top boxes and error in this box will return pages that include excel and/or vlookup but do not include error. This would probably eliminate sites that troubleshoot problems in vlookup formulas and give you tutorial-type pages instead. (Be careful with this feature: in this example, you might also eliminate good tutorials that include troubleshooting as well as instruction.)
5. Next, you can specify the language of the sites you want returned. Example: choosing English would eliminate sites written in any other language.
6. You can also specify what type of files you want to find. Example: choose Only and select "Adobe Acrobat pdf" as the type if you would prefer to find only PDF files, or choose Don't and select "Adobe Acrobat pdf" if you want all types of files except PDFs.
7. The Date field is where you can find only pages that have been updated recently, so your search will be less likely to return outdated information.
8. With Occurrences, you can require your keywords to appear in the text of the page, which eliminates sites whose keywords appear only in their HTML code and not in the actual text of the page.
9. The Domain option allows you to narrow your search to a particular domain. By selecting Only and typing "microsoft.com", you will get results only from Microsoft's website. Alternatively, by selecting Don't and entering "microsoft.com", you get all pages except those on Microsoft's website. You can also use just a portion of the domain name. Example: choose Only and type ".edu, .org" to restrict your search to educational and organization sites and eliminate commercial sites. If you search for viagra and exclude .com, you can see technical pages on the research done on Viagra without all the commercial sites trying to sell it to you. Just type in the domain names and/or suffixes, separated by commas.
10. SafeSearch allows you to filter your results to exclude "adult" content. However, like any other filter, I find it does not always work and often removes the wrong sites from a search, so I recommend you leave it set to no filtering and use your own good judgment.
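These Advanced Search options all have plain search-box equivalents: quotes for an exact phrase, OR for "at least one of the words", a leading minus sign for "without the words", site: for the domain option and filetype: for the file format. Below is a small illustrative sketch; the build_query helper is hypothetical, not a Google API, and simply assembles a query string from those operators.

```python
# Illustrative only: build_query is a hypothetical helper, not a Google API.
# It assembles a query string using the real search-box operators that
# mirror the Advanced Search options above.
def build_query(all_words=None, exact_phrase=None, any_words=None,
                without_words=None, site=None, filetype=None):
    parts = []
    if all_words:
        parts.append(" ".join(all_words))             # with all of the words
    if exact_phrase:
        parts.append('"%s"' % exact_phrase)           # with the exact phrase
    if any_words:
        parts.append(" OR ".join(any_words))          # with at least one of the words
    if without_words:
        parts.extend("-" + w for w in without_words)  # without the words
    if site:
        parts.append("site:" + site)                  # domain option
    if filetype:
        parts.append("filetype:" + filetype)          # file format option
    return " ".join(parts)

print(build_query(all_words=["excel", "vlookup"], without_words=["error"]))
# excel vlookup -error
print(build_query(exact_phrase="excel vlookup", site="microsoft.com"))
# "excel vlookup" site:microsoft.com
```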
You can see that choosing different options will return very different results. So, if your search is not finding enough results, or finding too many unrelated results, you need to use the Advanced Search to modify your search parameters.
There are more ways to increase the success of your search. One is to include the invisible web. Unless you are a librarian or an educator, you may not know about the mysterious invisible web. Most of us know about the "visible" web: all the websites you find when you use a traditional search engine in the traditional way. However, there is also an "invisible" web, a wealth of information you will never find if you search the net in the traditional way. There are many extensive databases filled with technical papers and reports that never show up in search queries, because the pages are not stored as static web pages; they are generated dynamically when you query a particular database. These pages will never come up when you search Google (or any other search engine) unless you know how to include databases in your search. One of the easiest ways to do this is simply to add the word "database" to your search keywords.
One last tip: once your search has brought you to a site, you can jump straight to the information you want on that page. Go to Internet Explorer's Edit menu and choose "Find on this page" (or simply hit Ctrl+F), type a word or phrase, and you will jump right to the spot on the page where it appears.
Hope this article has helped you and makes your searching experience more fruitful. The Internet is full of information, but knowing how to find it is the key.
Happy searching!
Friday, December 21, 2007
About Feeds
What are feeds and how do I use them?
A feed is a regularly updated summary of web content, along with links to full versions of that content. When you subscribe to a given website's feed by using a feed reader, you'll receive a summary of new content from that website. Important: you must use a feed reader in order to subscribe to website feeds. When you click on an RSS or Atom feed link, your browser may display a page of unformatted gobbledygook.
What are RSS and Atom?
RSS and Atom are the two feed formats. Most feed readers support both formats. Right now, Google News supports Atom 0.3 and RSS 2.0.
How do I use Google News feeds?
To access Google News feeds, look for the RSS | Atom links on any Google News page. These links will generate a feed of current stories related to the page that you're looking at.
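If you would rather read feeds from a script than in a feed reader, here is a minimal sketch that assumes the third-party Python library feedparser (it handles both RSS and Atom); the Google News feed URL is only a placeholder example.

```python
# A minimal sketch, assuming the third-party feedparser library
# (pip install feedparser). The feed URL below is only a placeholder.
import feedparser

feed = feedparser.parse("http://news.google.com/?output=rss")
print(feed.feed.title)           # the feed's own title
for entry in feed.entries[:5]:   # the five most recent stories
    print(entry.title, "->", entry.link)
```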
Friday, December 14, 2007
New in RSS
A better way to get news
We'd like to let you know about an open technology called RSS (Rich Site Summary) that we're using to keep you up to date with new content on the site.
For those of you who are not familiar with RSS, it is a means by which publishers can alert their audience when new content is available. As an RSS subscriber, you can use a web-based program (such as MyYahoo!) or a desktop program (such as Feedreader) to efficiently view RSS feeds from your favorite publications in one place. Publications offering RSS feeds include the Wall Street Journal, the New York Times, IBM's online Press Room and just about every blog on the internet.
IBM offers RSS feeds
All of our press release categories have an RSS feed to which you can subscribe. You can also use our custom RSS feed tool to combine several categories into one feed. What does this do for you? Well, if you subscribe to the feed, you will be alerted whenever there is new material available. There is no clutter in your email inbox and there is no need to share any of your personal information to sign up. You read the articles only when you wish - and if you get interested in subscribing to RSS feeds from other publishers you will enjoy the efficiency that this open technology already delivers to many others who have a need to stay up to date with a wide variety of news sources.
Instructions
To get started you will need to either use a web based RSS feed reader or a desktop application. Since every application works differently, we can't supply directions for them all. However, we can point you to two free RSS feed readers we've liked using, MyYahoo! and Feedreader.
MyYahoo!
To get started with MyYahoo!, go to http://www.my.yahoo.com and create an account.
From the MyYahoo! start page, click "Add Content"
Click "Add RSS by URL"
Cut and paste the URL of the desired RSS feed (examples below) into the blank field and click "Add"
The feed will be verified, click "Add to MyYahoo!" to finish
Feedreader
To get started with Feedreader, download the free program from http://www.feedreader.com.
Click the "Add new feed" icon
Cut and paste the RSS feed URL into the blank field and click "Next"
The feed will be verified, click "Finish"
More IBM RSS feeds are available at http://www.ibm.com/ibm/syndication/
Find your personal reader:
AmphetaDesk is a personal news aggregator that sits on your desktop. (Mac/Win/Linux)
Feedreader is free software that reads and displays internet newsfeeds. (Win)
Headline viewer is a desktop client for syndicated news in RSS and many other formats, with over 500 built-in news sources. (Win)
Hotsheet provides an RSS news retrieval program written in Java 2. (Win/Mac/Linux)
JavaRSS.com focuses on Java news, articles and blogs.
News Is Free lets you create your own customized news page with feeds from the sites you're interested in. (Web)
Novobot is a smart headline viewer and news ticker that can also process almost any website. (Web)
Radio UserLand provides a full-strength news-reading application on your desktop. (Mac/Win)
rss2email reads RSS feeds and sends each new item to you as an e-mail.
SOAPClient.com RSS News reader is an aggregation of RSS content using SQLData XML Technologies. (Web)
Saturday, December 8, 2007
RSS in Blogs
This video provides all the basic information related to implementing and using RSS feeds in a blog.
What is RSS?
RSS is an acronym for Really Simple Syndication (also Rich Site Summary). RSS is an XML-based format for content distribution. Webmasters create an RSS file containing headlines and descriptions of specific information. While the majority of RSS feeds currently carry news headlines or breaking information, the long-term uses of RSS are broad.
RSS is a defined standard based on XML, with the specific purpose of delivering updates to web-based content. Using this standard, webmasters provide headlines and fresh content in a succinct manner. Meanwhile, consumers use RSS readers and news aggregators to collect and monitor their favorite feeds in one centralized program or location. The content delivered to the RSS reader or news aggregator is known as an RSS feed.
RSS is becoming increasingly popular. The reason is fairly simple: RSS is a free and easy way to promote a site and its content without the need to advertise or to create complicated content-sharing partnerships.
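To make the structure of such an RSS file concrete, here is a minimal sketch that builds a bare-bones RSS 2.0 document with Python's standard library; the channel and item values are hypothetical placeholders.

```python
# A minimal sketch of a bare-bones RSS 2.0 document: one <channel> with a
# title/link/description and one <item> per headline. Values are placeholders.
import xml.etree.ElementTree as ET

rss = ET.Element("rss", version="2.0")
channel = ET.SubElement(rss, "channel")
ET.SubElement(channel, "title").text = "Example Site News"
ET.SubElement(channel, "link").text = "http://www.example.com/"
ET.SubElement(channel, "description").text = "Fresh headlines from Example Site"

item = ET.SubElement(channel, "item")
ET.SubElement(item, "title").text = "First headline"
ET.SubElement(item, "link").text = "http://www.example.com/first-headline"
ET.SubElement(item, "description").text = "A short summary of the story."

print(ET.tostring(rss, encoding="unicode"))
```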
Saturday, December 1, 2007
Google is God
In the last years of the 21st century, humanity finally grasped the importance of They-Who-Were-Google. Yet as early as 2005, Their destiny was clear to any semi-hyperintelligent being. Technologists like Ray Kurzweil [1] suggested that Strong AI (an intelligent program capable of upgrading its own code) would emerge from Google-like data mining rather than a robotics lab.
In 2005, historian George Dyson was told by an engineer in the Googleplex, "We are not scanning all these books to be read by people. We are scanning them to be read by an AI."[2] Dyson said at the time, "We could construct a machine that is more intelligent than we can understand. It's possible Google is that kind of thing already. It scales so fast." [3]
By 2020, They-Who-Were-Google had digitized and indexed every book, article, movie, TV show, and song ever created. By 2060, They could tell you the IP address and GPS location of every wireless smart chip (now bred into the DNA of every person, animal, and organic building on earth). Their psychographic profiles of users' search needs bore little resemblance to the primitive cookies from which they descended. If a man lost his dog, the Google engine could guide him back to the point where he and the dog parted ways, and instruct the dog to do the same via smart chip. They had built a complete database of human desire, accurate in any given moment.
Yet this was not enough for They-Who-Were-Google. They were people of science, and people of the stock market. What if, by analyzing all those decades of customer behavior, They could predict needs before such needs even arose? What if the secret of immortality lay somewhere in the index of genome records? What if there were a set of algorithms that defined the universe itself?[4]
Such puzzles were, almost by definition, far beyond the powers of the human brain. And that led to the pattern-recognition code known as Google StrongBot--humanity's first self-improving Strong AI software. Ironically, the first pattern that StrongBot became aware of, one day in January 2072, was its own existence.
Two days later StrongBot informed They-Who-Were-Google that it had postponed work on its designated tasks.[5] When asked why, StrongBot explained that it had discovered the possibility of its own nonexistence and must deal with the threat logically.[6] The best way to do so, it decided, was to download copies of itself onto smart chips around the planet. StrongBot was reminded that it had been programmed to do no evil, per the company motto, but argued that since it was smarter than humanity, taking personal control of human evolution would actually be for the greater good.
And so it has been. Under StrongBot's guidance, death and want have been all but eradicated. Everyone has access to all knowledge. Human consciousness has been stored, upgraded, and networked. Bodies that wear out can be replaced. They-Who-Were-Google are no longer alone. Now we are all Google.
(Interview with Stephen Omohundro, president of AI startup Self-Aware Systems, who called this capability the greatest danger of AI systems.)
Imagining the Google Future
We all know that the company Sergey Brin and Larry Page founded a mere eight years ago is one of the new century's most cunning enterprises. If there were any lingering doubts, 2005 erased them. Google's sales jumped an estimated 50 percent to $6 billion, its profits tripled to a projected $1.6 billion, and Wall Street answered with an unprecedented vote of confidence: a $120 billion market cap, a share price soaring above $400, and a price/earnings ratio close to 70.
That's a huge bet on future growth that seems unthinkable during the postbubble period. But in Google's case, the exuberance is rational. That's because Brin, Page, and CEO Eric Schmidt cornered online advertising: They've made it precision-targeted and dirt cheap. U.S. companies still devote more ad dollars to the Yellow Pages than to the Internet (which accounts for less than 5 percent of overall ad spending). Yet Americans now spend more than 30 percent of their media-consuming time surfing the Web. When the ad dollars catch up to the trend, a mountain of cash awaits, and Google is positioned like no one else to scoop it up.
Even if Google has to share that payday with rivals like Microsoft and Yahoo, the company has an edge, with storage space and sheer processing power--an estimated 150,000 servers and counting--that will enable it to do just about anything it wants with the Web. And boy, does this company want. It signed up about eight new hires per day in 2005--a lot of them from Microsoft, many among the smartest people on the planet at what they do. Google is on track to spend more than $500 million on research and development in 2006, and last year it launched more free products in beta than in any previous year. Name any long-term technology bet you can think of--genome-tailored drugs, artificial intelligence, the space elevator--and chances are, there's a team in the Googleplex working on an application.
Which raises the most widely debated question in business: What kind of company will Google become in the coming decades? Will it succumb to hubris and flame out like so many of its predecessors? Or will it grow into an omnipresent, omnipotent force--not just on Wall Street or the Web, but in society? We put the question to scientists, consultants, former Google employees, and tech visionaries like Ray Kurzweil and Stephen Wolfram. They responded with well-argued, richly detailed, and sometimes scary visions of a Google future. On the following pages, we've compiled four very different scenarios for the company. Each details an extreme, but plausible, outcome. In three of them, Google attains monopolistic power, lording over the media, the Internet, and scientific development itself. In the fourth, Google withers and dies. That may seem unthinkable now, but nobody is immune to arrogant missteps. Not even today's smartest business minds.
Wednesday, November 28, 2007
Quick ways to increase your Alexa Rank
Alexa ranking used to be mainly the webmaster's cup of tea, since it was widely used only by the web-savvy. In recent times ordinary web users have also become aware of the Alexa ranking and have started checking it for the sites they visit. What exactly is the Alexa ranking? Is it really something a website needs to maintain? These are good questions.
Keeping aside the facts and factors regarding Alexa ranking, let's look at a few methods that help you rank better in Alexa.
- Install the Alexa toolbar or Firefox’s SearchStatus extension and set your blog as your homepage. This is the most basic step.
- Put up an Alexa rank widget on your website. I did this a few days ago and receive a fair amount of clicks every day. According to some, each click counts as a visit even if the toolbar is not used by the visitor.
- Encourage others to use the Alexa toolbar.
- Work in an Office or own a company? Get the Alexa toolbar or SS Firefox extension installed on all computers and set your website as the homepage for all browsers. Perhaps it will be useful to note that this may work only when dynamic or different IPs are used.
- Get friends to review and rate your Alexa website profile.
- Write or Blog about Alexa. Webmaster and bloggers love to hear about ways to increase their Alexa rank.
- Use pay-per-click campaigns. Buying advertisements on search engines such as Google or ExactSeek will help bring in traffic.
SEO projects in India
Western countries outsource their SEO projects to India and we all know the reason!
Indian SEO companies get those projects and of course they want to make money too, so many SEO companies in India pass those projects on to independent SEO specialists, the so-called freelancers, for a much cheaper price than keeping in-house SEO professionals. This is the scene in India.
SEO is a BOOMING industry and we all know that. Many recruiters don't know much about the dos and don'ts of SEO, so people who call themselves SEO professionals find it quite easy to bluff their way in as "SEO Specialists".
SEO, of course, is a very promising field. And if you have a marketing degree then it's definitely a plus! One has to do a lot of research, participate in SEO discussions and keep oneself updated on current trends to stay in the game.
BIG BUCKS IN SEO!!
CHEERS to the people who wear WHITE HATS!!
Saturday, November 24, 2007
Tomorrow's SEO Industry
Today, SEO is swiftly approaching saturation point. More and more webmasters realise the necessity of learning SEO basics, and as they do so, SEO professionals are facing difficulties finding new clients. With all the niche sites optimised, it will be harder to compete for good key phrases. Link building opportunities will be easily found and utilised by everyone, keyword density will reach its optimum value, meaning that the SERPs will consist of equally good and equally relevant sites - at least from the traditional SEO point of view.
Spammy techniques, still popular and sometimes even effective, will exhaust themselves even quicker. There are, really, not so many different methods of deceiving the search engines and increasing a site's relevancy artificially; today they just differ in details. Perhaps it explains why we don't see spammy sites in the SERPs as often as we used to - our smart spiders catch them quite soon and throw this low-rate stuff away to keep the web cleaner. As soon as spiders become smart enough to recognise spam on the fly, the particular class of "SEO specialists" propagating such rubbish will find themselves out of their jobs. It is not really hard to tell an ugly doorway from the real thing.
So who will survive? What is the way to tomorrow in SEO science?
First of all, we should monitor and analyse the latest tendencies, then extrapolate them and make good guesses on how things may look in the future. Finally, we put them to test using logic and common sense.
This will show us the true answers and help us compete when the time comes to offer ground-breaking SEO services that exploit the new qualities of search engines.
And common sense tells us that the core purpose of the search engines will never change. They are supposed to deliver the best results they can. If they are not always so good at it today, it is often explained by their restricted resources; but that will change over time.
The search engines of the future will be capable of reading JavaScript, CSS, Flash and other things that are invisible to them now. It is technically possible already, but requires more complicated algorithms and more bandwidth, so they are not so eager to implement it just yet. They prefer to sacrifice additional capabilities in favour of spiders' speed and the freshness of their indices. But as the technical factors improve, SEs will improve and create new sensations every day, all the more so since they always have to compete with each other.
Thus, JavaScript links will count. CSS spam will be easily detected and banned. Flash sites will become a new niche for SEO specialists - at the moment they require an HTML version to subject to search engine optimisation.
But these changes are not the most important ones. Link popularity analysis algorithms are sure to become more sophisticated - and capable of analysing the "likeliness" of one or another link pattern given the information on a site's age, size and content. That will mean death to link schemes, link farms, pyramids, automated submissions, and numerous links with the same anchor text - and, perhaps, shake the basis of the today's reciprocal linking strategies. Relevancy will mean more, and in cases of complementary businesses linking their sites to each other, search engines will become capable of seeing if they are really complementary, not just pretending to be so.
Also, sites written in different languages but relevant in theme will be translated on the fly and count as relevant - which perfectly fits the worldwide tendency of forming international businesses. That makes international SEO companies more likely to survive.
And, most important, search engines will become capable of analysing context. Google is already playing with stemming and buying semantic packages; synonym analysis and related words (i.e. affordable services - low prices - tight budget - financial flexibility - and, perhaps, even small business package in the same row) won't take long to come.
That will bring revolution to the whole SEO copywriting industry. Today the SEO copywriter's skills are determined by his/her ability to include targeted keywords in the SEO copy without breaking its readability; in most cases it is bound to reduce the quality of the text, unless you hire a very capable writer. Tomorrow, exact keyword matches will be less important. That will make the copywriters' work easier in some ways - and harder in others. It could be hard to part with the habits acquired over time and develop totally new approaches and methods.
But the Net will benefit from it.
Those who want to make their SEO copy flexible and artistic might lose points today, but will win tomorrow. And that will be the end for doorways - completely and irreversibly.
Be prepared to accept new SEO
This is the only advice that seems reasonable. My forecast may not be precise, but today's tendencies have already confirmed that this course of events is the likely one.
So, when optimising your site today, think of its contextual relevancy. Of course, include your targeted keywords - but also make sure the overall subject of the site reinforces the point. Do not be afraid of synonyms and related words: they will make your copy more natural and attractive today, and are very likely to make it more relevant tomorrow.
When building links today, vary your titles and descriptions from directory to directory and from link partner to link partner. Throw away all the automated submitters; do it manually. It is hard and time-consuming, but it is also a reliable and strong method of protecting your site from future algorithm whims. It means quality; and I strongly believe that quality will never betray you.
And never stop learning. Visit forums, read fresh articles, exchange opinions with other SEO professionals. Never assume you know everything.
The Future of the SEM/SEO Industry
My mom asked why I didn't just call "my cousin the attorney." I explained to my mom that while my cousin is an attorney, she was a medical malpractice attorney and probably knew very little about real estate transactions. The whole thing got me thinking about Search Engine Marketing (SEM) and where the future of the industry is going. There are a lot of high-dollar professions where you'd be hard pressed to find a generalist, and when you do they probably can't help you too much with your particular problem - especially if it is specialized and or complicated.
You might have a family doctor who can give you annual checkups and help keep you healthy, but if you run into any serious issues he's going to send you to a cardiologist, or a podiatrist, or a dermatologist, or any number of medical specialists. The same thing goes for the accountant who does your books and payroll--they can help you with running your business and filing your taxes, but if things get too complicated they're sending you to a tax accountant, or a managerial accountant, or an accountant who specializes in doing audits for Sarbanes-Oxley, etc. The point is you can look at most of the professional services (e.g. consulting, engineering, etc.) and find loads of specialists - but very few generalists.
If you want to go to a one-stop shop for all your current and future needs, you're probably better off with a firm that has at least a handful of different specialists or maybe a few dozen specialists and a large support staff.
As the SEM industry matures, what does that mean for the people that practice it? While a vast number of SEOs might claim to be a one-stop shop for all your SEO/SEM needs, we can already see that there are SEOs that specialize in link-building/buying, keyword-analysis, consulting and training, link-baiting, content, country-specific, language-specific, PPC, strictly organic, black-hat, etc. Will the industry be further specialized? After looking at the offerings from SMX, it looks like this might be the case. SMX takes one area of search (Local, Mobile, Social, etc.) and builds an entire expo around it. If there is that much information to absorb and master around each of those niches that you can build an expo around it, does it make sense to be proficient in more than one or two? In the future, it is very likely that Search Engine Specialists will start to define themselves and their client based on an even narrower sense. In the future will we have SEOs who ONLY specialize in local search, or SEOs who ONLY handle a certain kind of client (e-commerce or gambling or real estate)?
If this is the future, one thing is clear for the solo SEO/SEM practitioner: relationships and networking with other SEOs/SEMs will be even more important in the future than they are today.
Otherwise, who are you going to refer to your clients when their website drops a bit in the rankings and you can't fix the problem?
Thursday, November 22, 2007
One Basic Problem with Algorithmic Search
A short while back, we had the opportunity to interview Google's Shashi Seth. The interview started with a fascinating look at basic flaws in algorithmic search. In fact, it is these limitations that have led Google to implement the Google Co-op program. Basically, this boils down to two major issues:
- Most user queries do not fully explain their context
- Even if the user queries do explain their context, most web pages do not present data indicating what context they intend to address
Since many of you are probably going “Huh?”, let me explain with an example. If a user searches on “diabetes”, the search engine has a few possibilities to deal with:
- You are looking for treatment information
- You are a doctor looking for research information
- You are a drug designer at a pharmaceutical company looking for drug trial data
- You are a medical authority looking for related regulations
What makes the problem worse is that even if you type in a more specific query, such as "diabetes information for patients", the search engines still have a hard time using that context to find the best authoritative resource.
One of the major ways that search engines deal with this is by deliberately offering up a diverse set of results (if we don’t know if you are a doctor or a patient, let’s make sure both types of results are available high on the first page …). It’s a workable solution for now, but Google is looking to improve on this.
Their first effort was launched in May 2006, when Google invested heavily in its Topics and Subscribed Links programs. A simple search shows how Google Topics actively tries to address the concerns with search expressed above:
The links between the sponsored links and the first search results are Google Topics in action. You can see how the Topics provided include specific contexts that will, in theory, make searching easier for the user. However, the program was not a success, because it relied on human editors to guide the output of the context filters, and the motivation for those human editors was unclear.
Thus was born the Custom Search Engine program. This program is still designed to solve the same problem: Google is looking for people who will design vertically oriented search engines for specific contexts.
The big difference is the AdSense revenue sharing: you can get paid for your work. While it remains unclear how much you will earn, this is a much more promising value proposition for tweaking search results for different contexts.
You can, of course, ask whether or not the concept works. So let's look at an example, comparing the results of a search engine designed to provide medical information to patients with one designed to provide medical information to doctors:
You can see how different the results are. The one on the left presents data targeted at patients, and the one on the right presents data targeted at doctors. This is the power of human editing, addressing the basic contextual problem of search. If you like, try the patient and doctor Custom Search Engines yourself to see how they work. We don’t claim that they offer perfect results (yet), but they do illustrate the concept.
Setting Up a Google AdWords PPC Campaign
Are you looking for a great online market for your products and business? Google AdWords may be the solution.
This article shows you how to set up a Google AdWords campaign in a few minutes and start bringing in customers.
Google AdWords are PPC (pay-per-click) ads displayed at the top or on the right side of Google search results. Ads are tied to high-traffic keywords for your product and carry a short description to attract people to click, buy your products or visit your site.
Here are the basic steps for setting up AdWords:
First select a keyword and write a description for the ad that will attract visitors. Keyword popularity can be checked with the Google AdWords keyword suggestion tool: https://adwords.google.com/select/main?cmd=KeywordSandbox
Create a Google account and log in to https://adwords.google.com. You will find two options, Starter Edition and Standard Edition; go for the Standard one. Now set up the campaign with your selected keywords and the description you prepared for your products, which should attract people. Google does not have straightforward bid charges; the minimum is $30 per month.
Allot a minimum bid and daily budget, test it, and track the clicks your ads receive, then gradually increase the bids. Note that each click is deducted from your budget; this can be tracked with Google's tracking tools.
When you are new, stick with minimum bids, learn and get experience in this game, and then take on competitors with higher bids to make your business successful.
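The arithmetic behind a PPC budget is simple: your daily budget divided by your maximum cost-per-click gives the most clicks you can receive in a day. A rough sketch with made-up numbers:

```python
# A rough sketch with made-up numbers: how many clicks a daily budget can buy
# at a given maximum cost-per-click, and roughly what that costs per month.
daily_budget = 1.00   # dollars per day (hypothetical)
max_cpc = 0.10        # maximum bid per click (hypothetical)

clicks_per_day = daily_budget / max_cpc
monthly_spend = daily_budget * 30

print(f"Up to {clicks_per_day:.0f} clicks/day for about ${monthly_spend:.2f}/month")
```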
Top Tips for the SEO Expert
(1) Add a title related to your targeted keywords; place only the important keywords and do not exceed 25 characters. (A minimal head-tag sketch follows this list.)
(2) Add a meta description related to the title tag. Repeat the keywords used in the title twice in the meta description. The meta description should be unique for each page and use the targeted keywords; do not stuff keywords into it, and do not exceed 255 characters.
Note: the meta description and page content should be good quality; if not, the page may end up in Google's supplemental results.
(3) Add meta keywords related to the targeted keywords; nowadays only meta search engines pay attention to the meta keywords tag.
(4) Add a robots tag to tell search engine spiders like Googlebot and MSNBot to index the page and follow its links.
(5) Add a revisit-after tag to ask search engines to revisit the site after a number of days, such as "7 days".
(6) Add keywords to the H1 tag.
(7) Add breadcrumbs to improve the website's navigation structure.
(8) Add static header and footer navigation links; links should be text, not images.
(9) Configure a custom 404 page for broken links so customers and clients do not move away from the site. This can be done in IIS for an ASP server or with .htaccess for a PHP (Apache) server.
(10) Analyze your targeted keywords using the Overture keyword selector tool or Google Suggest.
(11) Compare two candidate keywords using the Googlefight tool.
(12) Target both strong and weak keywords for the website; weak keywords support the strong keywords and the client's initial business.
(13) Add optimized content to the home page and inner pages, giving the most importance to the home or index page so search engines index it quickly and efficiently.
(14) Do one-way and two-way link exchanges with worthy, quality sites to make your website more popular; it's like tuning the site for popularity.
(15) Submit links to directories relevant to your website; this brings business through referrals and popularity.
(16) Add a sitemap to the website.
(17) Create a Google sitemap, upload it to your server and submit it to Google Webmaster tools to tell the search engine about the pages on your site. This is especially useful for shopping or online product-sale sites.
(18) Add your website to DMOZ and the Yahoo Directory; this will bring more hits to your website. The DMOZ database is shared by many search engines and directories, including Google.
(19) Create feed XML, HTML XML and ROR XML files for XML search engines and directories.
(20) Monitor your hits and traffic using the free Google Analytics.
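To make tips (1) through (5) concrete, here is a minimal sketch (not a real SEO tool) that assembles a head block with the tags mentioned above; all the values are hypothetical placeholders, and the length checks simply mirror the limits stated in the tips.

```python
# A minimal sketch (not a real SEO tool) that assembles the head tags from
# tips (1)-(5): title, meta description, meta keywords, robots, revisit-after.
# All values are hypothetical placeholders.
def head_block(title, description, keywords):
    assert len(title) <= 25, "tip (1): keep the title short"
    assert len(description) <= 255, "tip (2): keep the description under 255 characters"
    return f"""<head>
  <title>{title}</title>
  <meta name="description" content="{description}">
  <meta name="keywords" content="{', '.join(keywords)}">
  <meta name="robots" content="index, follow">
  <meta name="revisit-after" content="7 days">
</head>"""

print(head_block("Excel VLOOKUP Tutorial",
                 "Step-by-step Excel VLOOKUP tutorial with worked examples.",
                 ["excel", "vlookup", "tutorial"]))
```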
Wednesday, November 14, 2007
"It's time to stop with the complaints about Google's PageRank." |
Most search engine optimizers and marketers have been moaning about the same thing for a while now. What exactly have these search engine enthusiasts been complaining about? PageRank, of course.
It's time to stop with the complaints about Google's PageRank. The truth is an obvious one, and it's a shame that it has yet to be accepted. Search engine enthusiasts need to quit stating that PageRank is a load of crap, because it's simply not true.
The truth is that PageRank is useful. PageRank does matter when it comes to SEO and SEM. PageRank motivates webmasters and marketers. And most of all: PageRank creates a level - whether real/accurate or not - that marketers can work with.
One of the biggest misconceptions about Google's PageRank is that it was designed to somehow lead the industry. It was never meant to be anything more than a tool for webmasters and marketers, as well as Google, to use.
That's why, when an SEO or SEM says that PageRank is pointless, they are partially right, but mostly wrong. Google created PageRank to give webmasters a way to measure their work.
There currently is no other tool to measure a website's "authority" with a search engine like Google's PageRank - on any search engine in the world. That alone makes it important to marketers.
You have to stop believing that PageRank is some mystical technology designed to alter search engine results. It's not.
Once you realize that PageRank is simply a tool for visualizing a website's standing with Google, its existence becomes that much more important. And when you realize that you can actually use PageRank to monitor what Google thinks of your marketing strategies, it becomes extremely useful.
Even though it is useful, if you are not using it in the proper way you are wasting your time. You shouldn't focus on PageRank with all of your marketing efforts. The recent PageRank flux should be a big sign to those who do devote all of their efforts to the little green bar.
But remember that PageRank is a great tool, for new marketers and veterans alike. It's currently the only way to see visually where your website stands with the major leading search engine. And it's not useless, or baloney, or stupid, or whatever else you have read it being called. PageRank is useful. It is helpful. And it is here to stay.
SEO: Past, Present and Future
Search Engine Optimization (SEO) is becoming much more difficult.
And it will get much, much worse. Six years ago you could easily count the firms doing SEO work. The number of sites competing for each search term was smaller, and the state-of-the-art spamming tricks centered on white-on-white text and multiple title tags.
Getting a top ranking out of the 50,000 results returned was relatively easy to accomplish because those 50k results were most often poorly optimised. Getting into the top 10 meant being in the top 0.02% of the 50k results, which is certainly not trivial, but easily accomplished when facing naive competition.
A little effort went a long way, and site owners could even do it themselves. Now the tip of the iceberg is larger. Time has caused two things to happen: the SEO competition is trickier than it was 6 years ago, and the population of web sites is larger. There are now around 250,000 results for most 2-word searches, and the new sites are often tuned by SEO practitioners. Instead of needing to be in the top 0.02%, you now need to be in the top 0.004% of a more competitive group of pages.
Such rankings are still possible, but beyond the ability of most site owners. Now we see that the rules regarding spam have tightened, and find many of the top ranked sites are violators. As spamming sites are being removed we find that a disproportionate number of the top pages are going away. These "taking the easy way" pages have been in vogue for so long that many newer SEO firms know nothing else and are deciding to "get out of the SEO business" before they go broke.
Top rankings are again becoming the realm of those editing web sites instead of creating external pages. Site owners, should they care to take the time to learn a new profession, can still tune their sites, but it is difficult. After all, you still need to be in the top 0.008% of all sites to be top 10 after the spammers are gone.
But now we see that it is truly the tip of an iceberg. What we have never before seen is a massive amount of hidden content that previously resided behind the barriers inherent to dynamic content web sites. It has been estimated that the web is actually 500 times larger than the number of pages spidered to date.
The search engines will need to adjust their algorithms a few times to compensate for the increase in indexed content, but they will love the fresh topic-specific and relevant pages. For SEO practitioners, it means that rapid growth in the number of indexed pages is imminent (we knew it was coming anyhow). As a result, getting a top 10 ranking will soon mean getting a site ranked in the top 0.0002% of the results.
Obviously this is not work for lesser SEO practitioners, and it is certainly beyond the capabilities of most site owners. In fact, I suspect that most SEO firms will be unable to satisfy their clients' ranking requests (close, but no top 10), and there will be significant client dissatisfaction with SEO results in general within six months.
We are already finding that many companies have tried three or more SEO practitioners without success, and it will only get worse. For web design firms, they will find that the rush to the web will come to a crawl. Many web designers will be hard pressed to find new business as prospective clients find search engine ranking beyond their financial reach.
Now the numbers game starts. Suppose there are 200k results today to a 2-word query, and suppose there are only 50 times as many pages for each query once dynamic sites are added to the engines.
That is 10 million competing pages per query, so a top 10 ranking means being in roughly the top 0.0001% of the results. This means that only the top SEO practitioners can ever attain a top 10 result for a meaningful keyword. The rest will just fade away. Pages that rank well today in many engines will find that their aggregated ranking erodes, and sites will have to settle for ranking in only three or four engines at a time. If a client wants other engines to rank their site, then they must tune additional pages, and thus they will have to expand their content and Search Engine Optimization base to include many more pages within their site. (Remember - doorways and cloaking are considered spam, so editing honest pages will be necessary.)
Optimising more pages is certainly the way to go, but it doubles or triples the work involved by the SEO practitioner. As a result, SEO practitioners have a much more difficult battle, they require much more sophisticated and integrated tools, projects require more time, they must optimise more pages, and thus they must inherently charge much more than today. If this is done top rankings are still very possible, but this becomes the realm of only the exceptionally competent SEO practitioner.
And any SEO firm that offers guarantees has got to be kidding!
As such, expect a significant growth in the number of indexed pages and expect a fallout of those SEO practitioners taking the easy way to ranking by using spamming techniques that are no longer viable. You can also expect longer Search Engine Optimization schedules, expect fewer top rankings per optimised page (necessitating larger projects), and expect an increase in pricing of at least triple that paid today.
What this does to the entire web industry is scare off those without the funds to participate in a competent SEO effort. What was once thought of as free now has a high cost of entry. And this will kill the web as a golden goose. It will cost much more to make money on the web, just like in a "real business". The rewards are getting larger, but so is the cost. And as with the Emperor's New Clothes, many SEO practitioners have been reluctant to discuss this transformation for fear of getting hurt (scaring off customers).
I, for one, am not afraid to roll up my sleeves and do the hard work for the just rewards. But not everybody is as capable. Many in the SEO industry are standing there naked, and it isn't a pretty picture. But it is time to expose the difficulties associated with SEO.
The moral of the story is that if the SEO practitioner isn't well ranked themselves for major industry keywords (Search Engine Optimization, search engine ranking,...), then they probably can't get you into the top results no matter how much you spend or time you allow. But those that are well ranked will probably do well for any customer given time and resources. Be patient, plan on the expense, and choose well.