About This Club
Have any questions about Web Development? Ask here!
- What's new in this club
-
In robots.txt:

# Alexa: https://support.alexa.com/hc/en-us/articles/200450194-Alexa-s-Web-and-Site-Audit-Crawlers
User-agent: ia_archiver
Disallow: /

# Ahrefs: https://ahrefs.com/robot
User-agent: AhrefsBot
Disallow: /

# MOZ: https://moz.com/help/moz-procedures/crawlers/rogerbot
User-agent: rogerbot
Disallow: /

# MOZ: https://moz.com/help/moz-procedures/crawlers/dotbot
User-agent: dotbot
Disallow: /

# Semrush: https://www.semrush.com/bot/
User-agent: SemrushBot
Disallow: /

User-agent: SiteAuditBot
Disallow: /

User-agent: SemrushBot-BA
Disallow: /

User-agent: SemrushBot-SI
Disallow: /

User-agent: SemrushBot-SWA
Disallow: /

User-agent: SemrushBot-CT
Disallow: /

User-agent: SemrushBot-BM
Disallow: /

User-agent: SplitSignalBot
Disallow: /

# Majestic: https://mj12bot.com/
User-agent: MJ12bot
Disallow: /

# SerpStat: https://serpstatbot.com/
User-agent: serpstatbot
Disallow: /

# MegaIndex: https://ru.megaindex.com/blog/seo-bot-detection
User-agent: MegaIndexBot
Disallow: /

# SEO-PowerSuite-bot: https://www.link-assistant.com/seo-workflow/site-audit.html
User-agent: SEO-PowerSuite-bot
Disallow: /

User-agent: *
Disallow:

Please note: comments are provided for convenience, but some robots do not handle robots.txt files with comments correctly. Therefore, it is advisable to remove them for use in production.
-
You should use the "Require" directive instead of the "Order/Allow/Deny" directives. The latter are deprecated and will be removed in a future version of the Apache HTTP Server. For example:

1. When you need to deny access to specific files:

Instead of this:

<FilesMatch "(xmlrpc\.php|wp-trackback\.php)">
Order Allow,Deny
Deny from all
</FilesMatch>

Use this:

<FilesMatch "(xmlrpc\.php|wp-trackback\.php)">
Require all denied
</FilesMatch>

2. When you need to allow access only from a specific IP address or range, use this:

Require ip 103.21.244.0     # allow connections from the specified IPv4 address
Require ip 103.21.244.0/22  # allow connections from the specified IPv4 range
Require ip 2400:cb00::/32   # allow connections from the specified IPv6 range

# Allow connections only from Cloudflare (see https://www.cloudflare.com/ips/)
Require ip 103.21.244.0/22
Require ip 103.22.200.0/22
Require ip 103.31.4.0/22
Require ip 104.16.0.0/13
Require ip 104.24.0.0/14
Require ip 108.162.192.0/18
Require ip 131.0.72.0/22
Require ip 141.101.64.0/18
Require ip 162.158.0.0/15
Require ip 172.64.0.0/13
Require ip 173.245.48.0/20
Require ip 188.114.96.0/20
Require ip 190.93.240.0/20
Require ip 197.234.240.0/22
Require ip 198.41.128.0/17
Require ip 2400:cb00::/32
Require ip 2606:4700::/32
Require ip 2803:f800::/32
Require ip 2405:b500::/32
Require ip 2405:8100::/32
Require ip 2a06:98c0::/29
Require ip 2c0f:f248::/32
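On a related note, multiple bare Require lines like the ones above are implicitly treated as if they were inside a <RequireAny> container (any single match grants access). If you need to combine conditions explicitly, Apache 2.4 also provides <RequireAll>. A minimal sketch, with placeholder addresses rather than anything you should copy:

# Allow a range but exclude one address inside it (placeholder IPs)
<RequireAll>
    Require ip 192.0.2.0/24
    Require not ip 192.0.2.50
</RequireAll>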
-
As a developer, I often hear about "slow page load". Of course, I'm not the only one in this boat. We've grown used to visiting sites that perform phenomenally, which makes the experience of those that don't feel even more unbearable. There can be multiple reasons behind a slow-loading website. One issue that Magento store owners in particular tend to face comes down to the hundreds of images on their pages. I personally like the Magento 2 lazy load extension with its minimal but excellent features. Should I give it a try?
-
Is there any good solution for fixing the Cumulative Layout Shift (CLS) caused by Google AdSense? How do you deal with it?
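One commonly suggested mitigation (not an official AdSense feature, just a CSS sketch under the assumption that you know the typical height of your ad slot) is to reserve space for the ad container up front, so nothing shifts when the ad finally renders:

<!-- 280px is only an example; match it to the tallest ad size served in this slot -->
<style>
  .ad-slot {
    display: block;
    min-height: 280px; /* space is reserved even before the ad loads */
  }
</style>

<div class="ad-slot">
  <!-- the AdSense <ins class="adsbygoogle"> snippet goes here -->
</div>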
-
I spent some time finding this solution, so I decided to share it with you. There are no sleep() or delay() functions in JS, but you can easily implement one on your own using promises in combination with setTimeout():

function sleep(ms) {
  return new Promise(resolve => setTimeout(resolve, ms));
}

Usage:

await sleep(2000); // Sleep for 2 seconds

or

sleep(2000).then(() => {
  // Sleep for 2 seconds and then run the code from this block
});
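Note that the await form only works inside an async function (or at the top level of an ES module). A minimal sketch:

async function main() {
  console.log('Waiting...');
  await sleep(2000); // pauses this function for 2 seconds without blocking the event loop
  console.log('Done');
}

main();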
-
jQuery and Axios are remnants of the past. The Fetch API is better:

- Native support in all modern browsers
- Better stability and performance
- No need to download libraries, which speeds up page loading and saves traffic
- Compatible with older browsers (like IE 10 and 11) thanks to a lightweight fetch polyfill and a promise polyfill
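For reference, a minimal fetch() sketch (the URL and field names below are placeholders):

// GET JSON data
fetch('https://example.com/api/items')
  .then(response => {
    if (!response.ok) throw new Error(`HTTP ${response.status}`);
    return response.json();
  })
  .then(data => console.log(data))
  .catch(err => console.error(err));

// POST JSON data using async/await
async function createItem() {
  const response = await fetch('https://example.com/api/items', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ name: 'test' })
  });
  return response.json();
}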
-
This question has come up quite often in the last few days, so here are the instructions. The simplest way to do this on the Apache web server is:

1. Add this code to your .htaccess file (it must be placed in the root directory of your website):

<IfModule mod_headers.c>
Header set Strict-Transport-Security "max-age=604800; includeSubDomains" env=HTTPS
</IfModule>

2. Check that everything is OK on your website and its subdomains.

WARNING: ALL subdomains MUST be accessible over HTTPS if you want to include your domain in the preload list. Adding your domain to the preload list will make it and all of its subdomains inaccessible over plain HTTP in all modern browsers. Your users will be forced to use HTTPS only.

3. If everything is OK and you're 100% sure that you're ready to enable HSTS for a longer period and be added to the preload list, increase your max-age value to 63072000 and add the "preload" directive. Your code in .htaccess should look like this:

<IfModule mod_headers.c>
Header set Strict-Transport-Security "max-age=63072000; includeSubDomains; preload" env=HTTPS
</IfModule>

4. Finally, add your website to the preload list at https://hstspreload.org/
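To verify that the header is actually being sent, one quick check (assuming curl is available; replace example.com with your domain) is:

curl -sI https://example.com/ | grep -i strict-transport-security
# expected output, roughly:
# strict-transport-security: max-age=604800; includeSubDomains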