Mark B
Is there a VB.NET command to programmatically create a new aspx page on the
server from a string variable strMyNewHTML, so that someone else can go to
www.domain.com/mypage102.aspx?
Is that URL redirection via web.config SEO friendly?
Basically our concept is similar to a dictionary.com one where they have:
dictionary.com/truck
dictionary.com/trunk
dictionary.com/try
etc.,
and each page is laden with the particular keyword. I thought the only way
they did this was by creating separate pages for each.
Yes, you could just save that information using File.WriteAllText(filename, data)
from the System.IO namespace.

It does depend on what you're trying to do, though: you can carry out URL
redirection and things like this in your web.config, where you can have
different URLs correspond to a single page, which you can alter based on
which URL the browser called.
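For the first part, a minimal VB.NET sketch of writing the string out as a new
page, assuming strMyNewHTML already contains the complete markup, the name
mypage102.aspx is just an example, and the application identity has write
access to the site root:

Imports System.IO
Imports System.Web

Public Class PageWriter
    ' Writes strMyNewHTML out as a new .aspx file under the site root.
    ' mypage102.aspx is only an example name; in practice it would come
    ' from whatever naming scheme you use.
    Public Shared Sub SavePage(ByVal strMyNewHTML As String)
        ' Server.MapPath turns the virtual path into a physical path.
        Dim filename As String = HttpContext.Current.Server.MapPath("~/mypage102.aspx")
        File.WriteAllText(filename, strMyNewHTML)
    End Sub
End Class

That creates one physical file per page, though; for content that only varies
by a keyword, the web.config / single-URL route avoids that.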
So if in the robots.txt I had:
www.mysite.com/definitions/default.aspx?id=truck
www.mysite.com/definitions/default.aspx?id=trunk
www.mysite.com/definitions/default.aspx?id=try
they'd all be indexed separately in Google? It would be nice if they were;
it'd save us a lot of work and disk space.
So I would need to programmatically rewrite the robots.txt whenever another
word was added to the database? Or would it suffice if my homepage had all
these links on it (created programmatically)?
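For what it's worth, the single-page version of this is roughly the sketch
below: one default.aspx code-behind reads the id from the query string and
renders the definition, so no separate physical page per word is needed.
GetDefinition and litDefinition are placeholders here (your database lookup
and an asp:Literal control in the markup), not anything from the thread:

Imports System
Imports System.Web.UI

Partial Public Class _Default
    Inherits Page

    ' Normally auto-generated in the designer file for the <asp:Literal> in default.aspx.
    Protected litDefinition As System.Web.UI.WebControls.Literal

    Protected Sub Page_Load(ByVal sender As Object, ByVal e As EventArgs) Handles Me.Load
        ' Read the keyword from the query string, e.g. default.aspx?id=truck
        Dim word As String = Request.QueryString("id")
        If String.IsNullOrEmpty(word) Then
            Return ' no keyword supplied
        End If

        ' GetDefinition stands in for your own database lookup.
        Dim definition As String = GetDefinition(word)

        ' Give each keyword its own title so the pages differ when crawled.
        Page.Title = word & " - definition"

        litDefinition.Text = Server.HtmlEncode(definition)
    End Sub

    Private Function GetDefinition(ByVal word As String) As String
        ' Placeholder; replace with the real lookup.
        Return "Definition of " & word
    End Function
End Class

On the crawling side, robots.txt only tells crawlers what not to fetch rather
than listing pages for indexing, so what matters is that each ?id= URL is
reachable from a link, such as the programmatically generated list on your
homepage.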