Descriptive images – words are no longer necessary

These images belong to the category where words are no longer necessary… Or "no words necessary", to borrow the English expression from the subject line of the message in which I received them… Enjoy them, and give thanks to those who made them (though I'll be damned if I know who they are).

Some of them have circulated and recirculated on Facebook and other social networks (even I recognise them), so finding their authors would be akin to an archaeological dig. 😀

[28 image attachments from the original "No words necessary" email: ATT00022 through ATT00118]

Computers: Why the Turing test is a poor gauge for today's artificial intelligences (Dan-Marius.ro – my slice of internet, Oradea, Bihor, Romania)


You might have noticed the huge progress of computer processing power and the development of programming algorithms in the last decades, especially if you have lived through them. This evolution of technology made possible the appearance of Artificial Intelligence: pieces of complex software that can interact with their environment and learn from its changes and from their own actions. In one word: they adapt. Some examples are the supercomputers that are masters at chess, walking robots like Honda's ASIMO (perhaps the most famous one), and rolling robots that accomplish all sorts of tasks in labs with only one target in mind: the further development of their algorithms and their intelligence.

As the definition says, the Turing test is a test of a machine's ability to exhibit intelligent behaviour, a criterion proposed by Alan Turing for deciding whether a computer is intelligent. Even if it can be argued that a computer which acts, reacts and interacts like a sentient being is sentient, the Turing test as popularly stated also requires an Artificial Intelligence to achieve a 30% success rate: it must convince a human that he is talking to another human in 30% of all the trials. So, if a computer's conversation is indistinguishable from a human's conversation and it can fool a human for some time, then the machine and its Artificial Intelligence can be credited with some intelligence.

SEO and Webdesign: SEO friendly URLs

At the very beginning of this article I must tell you that URL stands for Uniform Resource Locator, a type of Uniform Resource Identifier (URI) that specifies where an identified resource is available and the mechanism for retrieving it. A URI is a string of characters used to identify a resource or a name on the internet. Such identification enables interaction with representations of the resource over a network using specific protocols.
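To make the "where" and "how" parts of that definition concrete, here is a small sketch using Python's standard library (Python and the example URL below are my own illustration, not something from the original article): the scheme names the retrieval mechanism, the host names where the resource lives, and the path identifies the resource itself.

```python
from urllib.parse import urlsplit

# Split a URL into the components the URI definition talks about
parts = urlsplit("http://www.webdesign-software-code-seo.com/freelancer-jobs/?page=2")

print(parts.scheme)  # the retrieval mechanism (protocol): "http"
print(parts.netloc)  # where the resource is available: the host name
print(parts.path)    # which resource is identified: "/freelancer-jobs/"
print(parts.query)   # extra parameters: "page=2"
```

The same split works for any well-formed URI, which is why the two terms get conflated so often.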

It is said that in popular usage, and in many technical documents and verbal discussions, URL is often incorrectly used as a synonym for URI. I have also met a lot of people who use the term without knowing where it comes from.

The way you create the URLs inside your pages is one of the most important methods of improving search engine optimization. Friendly URLs means URLs that are readable by humans, but this counts for the search engines as well.

There are two things that are important: the first one is the human point of view. How many of you would access a URL that looks like http://www.webdesign-software-code-seo.com/8548-954page.html, and how many would access http://www.webdesign-software-code-seo.com/freelancers-wanted/ or http://www.webdesign-software-code-seo.com/freelancer-jobs/? The tendencies are overwhelmingly in favour of the second type of URLs, because they contain words that have meaning for us and that give clues about what the page contains.

That's why it is better to use words in the URLs, for the search engines as well (the second point of view), especially if they are keywords (I mean the words that are the most relevant to the content and the subject of the article, the ones users may search for on the internet). The best way to make a friendly URL is to create it based on the title of your article (plus the date, if that helps identify the URL faster on a big site) and to emphasize the keywords in the page using heading tags. The search engines see all these elements, which helps them index the page more easily and improves its ranking.
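Turning a title into a friendly URL is usually done with a small "slugify" routine. Here is a minimal sketch in Python (the function name and rules are my own illustration; CMSs such as WordPress do essentially the same thing): lowercase the title, strip accents, and replace everything that is not a letter or digit with hyphens.

```python
import re
import unicodedata

def slugify(title):
    """Turn an article title into a URL-friendly slug."""
    # Strip accents (e.g. "ş" becomes "s") so the slug is plain ASCII
    normalized = unicodedata.normalize("NFKD", title)
    ascii_text = normalized.encode("ascii", "ignore").decode("ascii")
    # Lowercase, then collapse runs of non-alphanumeric characters into one hyphen
    return re.sub(r"[^a-z0-9]+", "-", ascii_text.lower()).strip("-")

print(slugify("SEO and Webdesign: SEO friendly URLs"))
# -> seo-and-webdesign-seo-friendly-urls
```

The result contains exactly the words a reader (and a search engine) can make sense of, with no surplus syntax.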

Friendly URLs are always static URLs. There are a lot of sites that generate URLs like this: http://www.some-site.com/index.php?category=456&subcategory=12&article=44574 – that's called a dynamic URL. When spaces, apostrophes and other special characters (like %e2%80%93) appear in these dynamic URLs it's even worse: when you try to post such a link on Facebook or StumbleUpon (or any social network), there is a good chance that the URL will appear broken.

From the search engines' point of view, a static URL will be indexed much more easily than the dynamic form, and there is no confusion and no missing parts from what could be important inside the URL.

Also, one of the most important things is to keep the URL as short and as descriptive as you can (clean and simple). The shorter the URL is, the more successful it will be, both for web rankings and for the visitors' use (this includes people copying the URL for linking purposes). Posting URLs on Twitter (for sharing information as well as for backlinking) has also become a habit in the last few years, and a longer URL cannot be posted there. Moreover, if a URL string is long, the weight and relevance of each word inside it is diluted. If your URLs are keyword-rich, including too many other words or phrases means that the importance of the keywords is lost amidst all the other words.

For example, consider the following URLs:

www.readmybooks.com/horror-science-fiction-temporal-travel-book.htm
www.readmybooks.com/store/books/science-fiction/horror-temporal-travel/book6888767.htm

The first URL is succinct, has the relevant keywords and no surplus syntax to dilute the importance of these words, and is easy for somebody to read, copy, and paste. The second URL is more complicated, and the relevance of the keywords is reduced.

URLs can be constructed using uppercase, lowercase or a mixture of both, and servers distinguish between them. Mixing uppercase and lowercase (even when following standard grammar rules such as using a capital letter for a name) can make your site structure unnecessarily complicated. You also run the risk of losing visitors who forget to use the required capital letter in the URL and then can't access the page. The most common way to post URLs across the internet is all lowercase. If you are redeveloping your URLs and come across pages using uppercase, you should create a permanent 301 redirect to the lowercase version to avoid confusion.
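On an Apache server, one simple way to set up such a redirect is a per-page rule in the .htaccess file. This is only a sketch (the paths are hypothetical, and it assumes mod_alias is enabled; a site with many such pages would use a bulk rewrite in the server configuration instead):

```apache
# .htaccess — permanently redirect a legacy mixed-case URL
# to its lowercase form (paths are hypothetical examples)
Redirect 301 /Freelancer-Jobs/ http://www.webdesign-software-code-seo.com/freelancer-jobs/
```

The 301 status tells both browsers and search engines that the lowercase URL is the permanent address, so the old one eventually drops out of the index.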

There are a few tools and methods used for URL rewriting in order to obtain friendly URLs. One of the most common is the .htaccess file placed in the root directory of the site (this is the special file that sets things up for you; it can contain all sorts of directives for the Apache server – if you're not using an Apache-based server, you'll have to read your server's manual to find out how to do it). It works perfectly with the Apache server and PHP (for example), so it works fine with all PHP-based systems. I will give some examples of how to work with .htaccess and the other tools later.
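As a taste of what such a rewrite looks like, here is a minimal .htaccess sketch that maps a friendly URL onto a dynamic PHP script behind it (the pattern, script name and query parameter are hypothetical examples, and it assumes mod_rewrite is enabled):

```apache
# .htaccess in the site root — assumes mod_rewrite is available
RewriteEngine On

# Serve the friendly URL /article/some-slug/ from the dynamic script,
# passing the slug along as a query parameter (names are hypothetical)
RewriteRule ^article/([a-z0-9-]+)/?$ index.php?slug=$1 [L,QSA]
```

The visitor and the search engine only ever see the friendly form; the dynamic URL stays an internal implementation detail.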

Finally, a very important thing when developing and maintaining a site is to keep the same URL structure. Do not change the rules that generate the URLs after the site is online, especially if they are created dynamically for the entire site. Once a URL is online, it must stay the same for as long as possible (until the site drops dead or the internet ends). The search engines do not like sites whose URLs change from one month to another (for example), so try to change them (or even correct them, when necessary) as little as possible. Also try to keep the URL structure the same across the entire site, not depending on its sections; this will make future development much easier, as there will be a standard convention to follow… and if your URL structure is used consistently, visitors will also find it much easier to understand how the information is organized and stored, and they will find what they are searching for faster.


The source is here.