IoT 4: Get Your Presence on the Internet (SEO Today)

Last Updated on September 20, 2020

In my observation, the "day" seems to be coming. So, I suggest that you acquire the skills to establish your presence without any help from the major services. How am I going to face the "day"? Who will become a taker, and who a payer? At the very least, we should toast the "day" with coffee.


* So, I don't like to prefix any character to "day". A single alphabetical character, a-z, should not carry meaning all by itself; that causes the death of literature. For example, from "D comes" I can imagine a duck in a pond offering a welcome. Usually, the person who holds the switch is beyond your imagination.


OK, the first thing to do to establish your presence is to request that your website's pages be indexed by search engines. The easiest way is to sign up for the search engines' webmaster tools and submit requests to have their robots crawl your websites. I have signed up with two engines (see the verification sketch after this list).


Google Search Console

Bing (Microsoft) Webmaster Tools
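
Both services ask you to prove that you own the site before accepting requests. One common method is to paste a verification meta tag into the <head> of your top page; the token values below are placeholders that each service issues to you.

<!-- Site-verification meta tags (token values are placeholders issued per site) -->
<meta name="google-site-verification" content="TOKEN_FROM_GOOGLE" />
<meta name="msvalidate.01" content="TOKEN_FROM_BING" />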


So, DuckDuckGo doesn't have webmaster tools for submitting your pages. However, the results of Dax are sourced from Bing and other providers. Russian Yandex has webmaster tools, but I haven't tested them as of today. Note that Chinese Baidu also seems to have webmaster tools, but they are entirely in the Simplified Chinese language.

When you analyze the logs of your website, you can see visits from several unknown crawlers. These are SEO and Internet-marketing tools in the commercial world, giving us "strong" power to promote operational actions in our business. They also give us one view of the cost-effectiveness of Internet marketing. So-called "old" media don't have such a pronounced tool, and that lack causes several difficulties in their business.

I think DuckDuckGo is the best search engine today in view of its search results. Plus, its page can be customized through URL parameters.
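
For example, below is a sketch of a customized query. As far as I know, kl sets the region and kp the safe-search level, but these parameter names are my understanding as of today, so check DuckDuckGo's settings page for the current ones.

# A DuckDuckGo search customized through URL parameters
# (kl = region, kp = safe-search level; names as I understand them today)
https://duckduckgo.com/?q=sitemap+protocol&kl=us-en&kp=-1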


* Several crawlers seem to be built with Python, i.e., home builders can make robots that crawl. For Internet marketing, crawlers are trick makers, because their watching is a mystery: unknown visitors may read our contents out of the crawlers' data tanks instead of from our site. So, to get correct marketing figures, restricting "persona non grata" crawlers is needed.
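
A minimal way to turn away such crawlers is a robots.txt at the root of the site. The crawler name below is a placeholder, and note that only well-behaved robots honor this file; rude ones must be blocked at the web server itself.

# robots.txt: refuse a "persona non grata" crawler ("BadBot" is a placeholder name)
User-agent: BadBot
Disallow: /

# everyone else may crawl everything
User-agent: *
Disallow: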


The second thing is to make sitemap.xml and submit it to search engines.
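
For reference, a minimal sitemap.xml following the sitemaps.org protocol looks like the following; the URL and date are placeholders for your own pages.

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page; <lastmod> is optional -->
  <url>
    <loc>https://subdomain.example.com/</loc>
    <lastmod>2020-09-20</lastmod>
  </url>
</urlset>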


# On Terminal of Debian GNU/Linux 10

curl "http://www.google.com/ping?sitemap=https%3A%2F%2Fsubdomain.example.com%2Fsitemap.xml"

curl "http://www.bing.com/ping?sitemap=https%3A%2F%2Fsubdomain.example.com%2Fsitemap.xml"


As for the trend in search engine optimization (SEO) today, the first one is the best because crawling seems to occur ASAP; the second one, however, does not trigger quick crawling. Nowadays, submitting requests to crawl pages can be automated through APIs, e.g., the Google Search Console APIs. That means the robots of the search engines are very busy. In my experience so far, their busyness suggests that I should submit all of my pages, because crawling doesn't dive into the deeper levels of my pages. Even when I have submitted sitemaps, the robots' crawling doesn't seem to gain much coverage.
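
As a sketch of such automation, the call below notifies Google's Indexing API of an updated page. It assumes you already hold an OAuth 2.0 access token in $TOKEN, and Google officially limits this API to certain content types, so take it as an illustration rather than a general-purpose switch.

# Sketch: notify Google of an updated page via the Indexing API
# (assumes an OAuth 2.0 access token in $TOKEN; official access is limited
# to certain content types)
curl -X POST "https://indexing.googleapis.com/v3/urlNotifications:publish" \
  -H "Authorization: Bearer $TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"url": "https://subdomain.example.com/some-page", "type": "URL_UPDATED"}'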


By the way, you may remember blog search services. However, they are ending their services one after another. Has the blog itself ended? Hmm...
