The basics of search engines

A search aggregator typically allows users to select specific search engines ad hoc to perform a given query. When the user enters the query into the search aggregator, it generates the required URL “on the fly” by inserting the search query into the parameterized URL for the search feed. A parameterized URL looks something like this:

http://news.google.com/news?hl=en&ned=us&q={SEARCH_TERMS}&ie=UTF-8&output=rss

In this case, the {SEARCH_TERMS} parameter would be replaced with the user-requested search terms, and the query would be sent to the host. The search aggregator would then parse the results and display them in a user-friendly way.
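
For illustration, here is a minimal Python sketch of that substitution step, assuming the Google News RSS template quoted above. The helper names (build_query_url, fetch_results) and the ElementTree parsing are illustrative assumptions, not the API of any particular aggregator, and the old Google News endpoint may no longer respond.

# Minimal sketch of the "on the fly" URL construction described above.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

TEMPLATE = ("http://news.google.com/news?hl=en&ned=us"
            "&q={SEARCH_TERMS}&ie=UTF-8&output=rss")

def build_query_url(template, search_terms):
    """Insert the user's (URL-encoded) terms into the parameterized URL."""
    return template.replace("{SEARCH_TERMS}",
                            urllib.parse.quote_plus(search_terms))

def fetch_results(url):
    """Fetch the RSS feed and return (title, link) pairs from its items."""
    with urllib.request.urlopen(url) as response:
        tree = ET.parse(response)
    return [(item.findtext("title"), item.findtext("link"))
            for item in tree.iter("item")]

if __name__ == "__main__":
    url = build_query_url(TEMPLATE, "search aggregator")
    for title, link in fetch_results(url):
        print(title, "-", link)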
This system has several advantages over traditional metasearch engines. Primarily, it gives the user greater flexibility in deciding which engines should be used to perform the query. It also allows new engines to be added easily to the user's personal collection (similar to the way a user adds a new news feed to a news aggregator).
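
As a rough sketch of that personal-collection idea, each engine could be stored as a parameterized URL template, much like a feed subscription, with the user choosing which engines a query fans out to. The dictionary layout and the second template (search.example.com) below are hypothetical placeholders, not real endpoints.

# Sketch: a user's engine collection as URL templates, selected per query.
import urllib.parse

engines = {
    "Google News": ("http://news.google.com/news?hl=en&ned=us"
                    "&q={SEARCH_TERMS}&ie=UTF-8&output=rss"),
    # Adding a new engine is just adding another template entry:
    "Example Engine": "http://search.example.com/rss?q={SEARCH_TERMS}",
}

def build_urls(selected, search_terms):
    """Expand the templates of only the engines the user selected."""
    encoded = urllib.parse.quote_plus(search_terms)
    return {name: engines[name].replace("{SEARCH_TERMS}", encoded)
            for name in selected}

print(build_urls(["Google News"], "federated search"))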
See also:
Aggregator
Metasearch engine
Federated search
Search engine software: http://en.wikipedia.org/wiki/Category:Search_engine_software

The design of my intelligent 3D search engine will take the file type into account using a specific keyword, and will spawn hundreds of searches to bring back a coordinated database of links placed on a typical 3D page. The results can then be reorganised with filters by answering “yes” or “no” questions, narrowing the set until whatever remains in the zones is the most accurate answer. Results can be moved between 3D pages, and answers can be zoomed in or out. The same design can also be done in 2D.

To achieve this, I need hundreds of web-bots to index the entire Internet, a redesign of metadata across all the different file types, and a cache huge enough to store the results of searches. My intelligent software will bring intelligence to the use of English for programming, so that a keyword or sentence can be easily understood by the computer. I use web-bot technologies to parse information, turning keywords, sentences, paragraphs and file types into machine language, so the data can easily be linked, compared and manipulated. That is why I would want a quantum computer, but the desktop can run on a next-generation CPU that need not be as fast as a quantum computer, as long as it can handle huge volumes of data. Based on these insights, it should be possible to re-engineer the desktop CPU within five years, ready for a totally new experience with a 3D search engine and intelligent software. Sorry, but quantum computers will not take over the world just yet.
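
One way to read the yes/no filtering idea is as a progressive narrowing of the result set: each answer keeps or discards the results that match a question. The sketch below is only an illustration of that reading; the Result fields, the sample data and the example question are all hypothetical, not part of any existing system.

# Sketch: narrowing search results by answering yes/no questions.
from dataclasses import dataclass

@dataclass
class Result:
    title: str
    file_type: str   # e.g. "pdf", "html", "3d-model"
    zone: str        # the "zone" a result is placed in on the page

def apply_yes_no(results, predicate, answer_is_yes):
    """Keep results matching the predicate on 'yes', drop them on 'no'."""
    return [r for r in results if predicate(r) == answer_is_yes]

results = [
    Result("Engine overview", "html", "zone-1"),
    Result("Turbine blade model", "3d-model", "zone-2"),
    Result("Maintenance manual", "pdf", "zone-1"),
]

# Question: "Are you looking for a 3D model?" -> user answers yes.
results = apply_yes_no(results, lambda r: r.file_type == "3d-model", True)
print([r.title for r in results])   # ['Turbine blade model']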

Contributed by Oogle. 

 

Author: Gilbert Tan TS

IT expert with more than 20 years of experience in multiple operating systems, security, data and the Internet. Interests include AI and Big Data, the Internet and multimedia. An experienced real estate agent, insurance agent and futures trader. I am capable of finding any answer in the world you want, as long as there are reports available online for me to do my own research, to bring you closest to all the unsolved mysteries in this world, because I can find all the paths to the Truth and what the Future holds. All I need is to observe, test and probe to research anything I want; what would take you months to achieve, I need only a few hours.
