Programmatically create aspx page?

M

Mark B

Is there a VB.NET command to programmatically create a new aspx page on the
server from a string variable strMyNewHTML?
 
M

Mike Lovell

Is there a VB.NET command to programmatically create a new aspx page on
the server from a string variable strMyNewHTML?

Might have to explain what you're trying to do a bit more. But you can
programmatically create content yes.

Using the 'Response' stream.

Response.Write(strMyNewHTML)

Called in Page_Load, that will dynamically display whatever the string contains.
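A minimal sketch of that idea, assuming a code-behind page (the markup assigned to strMyNewHTML is just a placeholder for whatever you build dynamically):

```vbnet
' Code-behind sketch: write a string straight into the response stream.
' The HTML here is a placeholder; in practice you'd build it from data.
Protected Sub Page_Load(ByVal sender As Object, ByVal e As EventArgs) Handles Me.Load
    Dim strMyNewHTML As String = "<h1>Truck</h1><p>A large road vehicle.</p>"
    Response.Write(strMyNewHTML)
End Sub
```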
 
M

Mike Lovell

Yeah, but I need to save the .aspx file onto the disk on the server so it can be served later.

Yes, you could just save that information using:

File.WriteAllText(filename, data)

In the System.IO Namespace.
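As a sketch in VB.NET (the folder and file names are assumptions, and the application pool identity needs write permission to the target folder):

```vbnet
Imports System.IO
Imports System.Web

' Sketch: persist generated markup to a new file on the server.
' "~/pages" and "generated.aspx" are placeholder names; substitute your own.
Public Sub SavePage(ByVal strMyNewHTML As String)
    Dim fileName As String = _
        HttpContext.Current.Server.MapPath("~/pages/generated.aspx")
    File.WriteAllText(fileName, strMyNewHTML)
End Sub
```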

It does depend on what you're trying to do, though. You can carry out URL
redirection and similar things in your web.config, where you can have
different URLs correspond to a single page and vary the output based on
which URL the browser requested.
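For a fixed set of URLs, one built-in option (ASP.NET 2.0 and later) is the urlMappings section in web.config. This is a sketch under the assumption of a definitions/default.aspx page taking an id parameter; note the mappings are static, so each URL must be listed explicitly:

```
<configuration>
  <system.web>
    <urlMappings enabled="true">
      <add url="~/truck" mappedUrl="~/definitions/default.aspx?id=truck" />
      <add url="~/trunk" mappedUrl="~/definitions/default.aspx?id=trunk" />
    </urlMappings>
  </system.web>
</configuration>
```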
 
M

Mark B

Is that URL redirection via web.config SEO friendly?

Basically our concept is similar to a dictionary.com one where they have:

dictionary.com/truck
dictionary.com/trunk
dictionary.com/try

etc

and each page is laden with the particular keyword. I thought the only way
they did this was by creating separate pages for each.
 
G

Guest

Is that URL redirection via webconfig SEO friendly?

Basically our concept is similar to a dictionary.com one where they have:

dictionary.com/truck
dictionary.com/trunk
dictionary.com/try

etc

and each page is laden with the particular keyword. I thought the only way
they did this was by creating separate pages for each.





Mark,

Don't obsess over SEO-friendly URLs. A link like /page.aspx?id=truck
has the same meaning as a link like /truck.

If you definitely want to have "short" URLs, then you can either use an
httpModule (search for "URL Rewriting") or ASP.NET MVC. In both cases
the idea is not to create a new .aspx page, but to return output as if
it were a new page.

Let me know if you have further questions regarding this.

Hope this helps
 

P

Patrice

Hello,

Rather than throwing some ideas at us to achieve an unknown goal, could
you start by explaining what you are trying to do?

For now, my understanding is that you would like to have "friendly" URLs,
which is done using what is called "URL rewriting". See for example:
http://msdn.microsoft.com/en-us/library/ms972974.aspx

The idea is that the request for a friendly URL is intercepted, and your
URL-rewriting module transparently directs the request to an actual page,
possibly with some URL parts turned into query string parameters...
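The interception step can be sketched as an HttpModule in VB.NET. The /definitions/&lt;word&gt; path layout and the class name here are assumptions for illustration; HttpContext.RewritePath does the transparent redirect:

```vbnet
Imports System
Imports System.Web

' Sketch of a URL-rewriting HttpModule. The /definitions/<word> layout
' is an assumed convention, not something fixed by ASP.NET.
Public Class WordRewriteModule
    Implements IHttpModule

    Public Sub Init(ByVal app As HttpApplication) Implements IHttpModule.Init
        AddHandler app.BeginRequest, AddressOf OnBeginRequest
    End Sub

    Private Sub OnBeginRequest(ByVal sender As Object, ByVal e As EventArgs)
        Dim app As HttpApplication = CType(sender, HttpApplication)
        Dim path As String = app.Request.Path
        ' e.g. /definitions/truck -> /definitions/default.aspx?id=truck
        If path.StartsWith("/definitions/") AndAlso Not path.EndsWith(".aspx") Then
            Dim word As String = path.Substring("/definitions/".Length)
            app.Context.RewritePath("/definitions/default.aspx?id=" & word)
        End If
    End Sub

    Public Sub Dispose() Implements IHttpModule.Dispose
    End Sub
End Class
```

The module would then be registered in web.config under system.web/httpModules so it runs for every request.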

Also, having the big picture would help us make better suggestions. Do you
want to do this only for search engines, or do you also want to actually use
those friendly URLs on your site? What benefit are you looking for?
 
M

Mark B

I'd like to just show you the website URL (picture paints a thousand words)
but it's not quite done yet. Hopefully in the next few days ... then I can
post it to this thread.
 
P

Patrice

Ok, have you checked Google Webmaster Tools? AFAIK they provide quite a
bunch of tools, including the ability to see how your site is seen by the
Google indexer, plus guides about best practices...

The key point here is to understand and measure how a change impacts your
site, rather than making random changes with no way to find out whether
they improved (or possibly damaged) your site's ranking...
 
G

Guest

So if in the robots.txt I had:

www.mysite.com/definitions/default.aspx?id=truck
www.mysite.com/definitions/default.aspx?id=trunk
www.mysite.com/definitions/default.aspx?id=try

they'd all be stored separately in Google? It would be nice if they did --  
save us a lot of work and disk space.

So I would need to programmatically re-write the robots.txt whenever another
word was added to the database? Or would it suffice if my homepage had all
these links on (created programmatically)?

The robots.txt file is used to define what content should be excluded by
search engine spiders. You don't need to list every single URL
there. To have all pages indexed, you can either delete robots.txt or put
just the following two lines in it:

User-agent: *
Disallow:

I don't think it would be a problem if you enumerated all the links in that
file, but I'm pretty sure this will not help to increase any ranking.
 
M

Mark B

OK thanks.

 
A

Andrew Morton

Mark said:
So if in the robots.txt I had:

www.mysite.com/definitions/default.aspx?id=truck
www.mysite.com/definitions/default.aspx?id=trunk
www.mysite.com/definitions/default.aspx?id=try

they'd all be stored separately in Google? It would be nice if they
did -- save us a lot of work and disk space.

So I would need to programmatically re-write the robots.txt whenever
another word was added to the database? Or would it suffice if my
homepage had all these links on (created programmatically)?

I think you're looking for sitemaps:

"About Sitemaps - Sitemaps are a way to tell Google about pages on your site
we might not otherwise discover..."
http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=156184

(Works for other search engines too.)
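A minimal sitemap.xml, using the URLs already mentioned in this thread (the www.mysite.com host is the placeholder from earlier posts); since the words come from a database, you would generate this file programmatically so new entries are picked up:

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.mysite.com/definitions/default.aspx?id=truck</loc></url>
  <url><loc>http://www.mysite.com/definitions/default.aspx?id=trunk</loc></url>
  <url><loc>http://www.mysite.com/definitions/default.aspx?id=try</loc></url>
</urlset>
```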
 
