Yeah, so make sure there isn't something like this in the site file for Nginx:
location = /robots.txt { access_log off; log_not_found off; }
That would cause Craft to never see the request.
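One quick way to check whether Craft (and thus SEOmatic) is actually handling the request is to look for Craft's X-Powered-By response header (an illustrative command; substitute your own domain):

```shell
# If Craft's front controller handled the request, the response includes
# an "X-Powered-By: Craft CMS" header; if Nginx short-circuited the request
# before it reached PHP, the header will be absent.
curl -sI http://example.com/robots.txt | grep -i '^x-powered-by'
```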
from seomatic.
What web server are you using? Make sure there aren't any rules that might not give Craft a chance to see the request.
Using Digital Ocean with Ubuntu and ServerPilot - I believe it's running Apache/Nginx with PHP 7.
Abbas, let me know what you uncover. I strongly suspect that this is a server configuration issue, but potentially it could also be a misunderstanding.
The webpage you pointed me to is http://ww2.tbreak.com/companies/apple -- robots.txt should only appear at the root, so http://ww2.tbreak.com/robots.txt, assuming that's the domain we're talking about here?
I just ran the tool you mentioned on my site (which uses SEOmatic) and it found the robots.txt just fine; I wonder if it isn't some kind of server configuration issue, as mentioned above?
Thanks for the follow-up.
Yes, the robots.txt should appear in the root but it's not. Checked with the server guys and this was their response:
ServerPilot does not block search engine robots by default in our configuration. You should not need to make any changes to Nginx.
If the plugin is not creating the robots.txt file I would recommend verifying the files permissions of your various app files, which could prevent the plugin from creating the necessary files.
Well, the plugin doesn't create a robots.txt file -- what it does is create a route in Craft such that if there's an incoming request for robots.txt, SEOmatic handles it by routing it to a controller that renders the robots.txt as a template.
So I suspect that what's happening here is that, for some reason, Craft is never seeing the request. What about humans.txt? It looks like that IS showing up on your domain:
http://ww2.tbreak.com/humans.txt
...which is handled exactly the same way robots.txt is, so I really strongly believe this is a server setup issue. Have them check to see if the .conf file for your ServerPilot setup has a directive like this in it:
location = /robots.txt { access_log off; log_not_found off; }
...which I think is in there by default on Nginx. It needs to be commented out or removed for robots.txt to work.
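If that directive is present, the fix is simply to disable it so the request falls through to Craft's front controller (a sketch; the exact .conf file location varies by ServerPilot setup):

```nginx
# Commented out so /robots.txt is passed through to Craft (index.php),
# letting SEOmatic render it as a template:
# location = /robots.txt { access_log off; log_not_found off; }
```

After reloading Nginx, the request should reach Craft and SEOmatic's route will take over.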
Oh, by the way, you mentioned a sitemap as well -- SEOmatic doesn't create a sitemap. There are a number of plugins that will do that for you, however. It's on my "wish list" of things to implement.
Would you mind taking a quick look at how the plugin is set up in Craft before I approach the server guys again? I can pass you the CP login details.
I wouldn't mind at all, but I don't think it's going to help... there really are not any settings for robots.txt in SEOmatic. If humans.txt is working on your site (which it is) and robots.txt is not, then it is definitely the server configuration issue mentioned above. The exact same mechanism is used for both.
^ You might want to delete or edit that comment, it's publicly visible.
Anyway, I logged in and fixed your SEO Keywords, but I don't see anything that's configured wrong. I just tested it, and it looks like your robots.txt is working right:
http://ww2.tbreak.com/robots.txt
kotak:~ andrew$ curl -I http://ww2.tbreak.com/robots.txt
HTTP/1.1 302 Found
Server: nginx
Date: Thu, 03 Mar 2016 20:16:39 GMT
Content-Type: text/html; charset=UTF-8
Connection: keep-alive
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Cache-Control: no-store, no-cache, must-revalidate
Pragma: no-cache
X-Powered-By: Craft CMS
Set-Cookie: CraftSessionId=739346e89c0056d2d56400716c3869e3; path=/; HttpOnly
Set-Cookie: 319e541db533da347dd78ecfd2c16496RatingHistory=5b231bf4125703e620453cbf076a44e08346046ds%3A48%3A%221c7e61a82810504c62c0aafe23ca7e48fc082507YTowOnt9%22%3B; expires=Wed, 04-Mar-2026 06:24:19 GMT; Max-Age=315569260; path=/; HttpOnly
Location: http://ww2.tbreak.com/login
Weird -- site-analyzer.com is still not finding it. Even I'm getting a 404 when trying to load http://ww2.tbreak.com/robots.txt
aahhhhhh, I know what it is :)
You're using an old version of SEOmatic that had a bug in it where robots.txt would not display unless you were logged in.
Update to the latest (1.1.6 was just released today) and it'll be fixed. Sorry for the run-around, I completely forgot. I should have checked the version of SEOmatic you were running.
BAM -- upgraded and now it seems to work.
BTW, do you have any plans to add microdata or redirects to your plugin? I think both of them could come in handy.
There's already a TON of microdata in the plugin -- what specifically do you mean?
As for redirects, I thought about it, but there are some very good (free) plugins out there already that do it well. What would you use it for?
Maybe I haven't set it up correctly but, for example, here is a review with a final rating, yet Google isn't showing that rating in rich snippets: http://ww2.tbreak.com/reviews/sennheiser-momentum-2.0. Or here is a product page which is again missing microdata: http://ww2.tbreak.com/reviews/samsung-galaxy-s7
With regards to redirects, we're shifting from WordPress to Craft and some posts are getting their URL structure changed. Would have loved to be able to map out old vs. new URLs within your SEO module so that it sets up redirects.
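Mapping old WordPress permalinks to new Craft URLs can also be done at the server level rather than in a plugin. One approach on Nginx is a map block (a sketch with made-up example paths -- substitute your real old and new URLs; the map block goes in the http context):

```nginx
# Hypothetical old-to-new URL mapping for a WordPress-to-Craft migration.
map $request_uri $new_uri {
    default                                 "";
    /2015/06/sennheiser-momentum-2-review/  /reviews/sennheiser-momentum-2.0;
    /2016/02/samsung-galaxy-s7-review/      /reviews/samsung-galaxy-s7;
}

server {
    # ...existing server configuration...

    # Issue a permanent redirect whenever the request matches an old URL.
    if ($new_uri != "") {
        return 301 $new_uri;
    }
}
```

A 301 (permanent) redirect also tells search engines to transfer ranking signals from the old URL to the new one.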
As for "missing" microdata, are you using a custom template for rendering the SEO meta or something? Because there's a bunch of metadata missing from your site that is there by default.
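For context, the rating-style rich snippets Google reads are expressed as schema.org structured data in the page head. A review page would typically carry something like this (a hand-written JSON-LD sketch with made-up values, not SEOmatic's exact output):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Review",
  "itemReviewed": { "@type": "Product", "name": "Sennheiser Momentum 2.0" },
  "reviewRating": { "@type": "Rating", "ratingValue": "4.5", "bestRating": "5" },
  "author": { "@type": "Person", "name": "Reviewer Name" }
}
</script>
```

If a custom template strips this block out, Google has nothing to build the rich snippet from.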
For redirecting:
https://github.com/davist11/craft-reroute
https://github.com/rkingon/Craft-Plugin--Redirect-Manager
Looks like I had created a template in SEOmatic when testing -- I've deleted it but am still not seeing microdata. Or are you talking about HTML templates?
Thanks for the links to the redirects.
Also, I've added my Google Analytics ID but it's not tracking pages. Dev mode is off.
When you call SEOmatic, it should be just:
{% hook 'seomaticRender' %}
Make sure you are not doing this anywhere:
{% set seomaticTemplatePath = 'path/template' %}
You're still using some kind of custom template from which you must have removed a bunch of the SEO tags.
This is the same reason the Google Analytics code is not included.