Dear All,
I'm currently developing a solution where large amounts of personalised
emails are created (and no, this is not spam...) on the ASP.NET
platform and delivered by a Debian Linux server running Qmail and
MySQL. Currently the .NET application just connects to the SMTP port on the
Linux server and sends each mail one by one. This creates an awful lot of
traffic and isn't really a good way of handling >100,000 emails/month.
I would like a solution where all this data is first prepared on the .NET
platform and then transferred to the Linux platform to be handled and sent.
But how should I solve this in a way that is both secure/reliable and efficient?
So basically I have two questions:
1. Should I prepare a large XML dataset and ship it to the Linux server to be
handled locally (Perl + MySQL + Qmail)? This would need some kind of status
check, since if the Linux server went down, some mail might already have
been sent.
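To make the status-check idea concrete, here is a rough sketch of what I have in mind: give every message in the XML batch a stable id, so that after a crash the Linux side can skip the ids it has already handed to Qmail instead of resending them. (Python is used purely for illustration; the real producer would be .NET and the consumer Perl. The element names and the helpers `build_batch`/`pending_messages` are made up for this sketch.)

```python
# Sketch: an XML batch with stable per-message ids, enabling resume
# after a crash without duplicate sends. Illustrative only.
import xml.etree.ElementTree as ET

def build_batch(messages):
    """messages: list of (recipient, subject, body) tuples."""
    batch = ET.Element("batch")
    for i, (recipient, subject, body) in enumerate(messages):
        msg = ET.SubElement(batch, "message", id=str(i))
        ET.SubElement(msg, "to").text = recipient
        ET.SubElement(msg, "subject").text = subject
        ET.SubElement(msg, "body").text = body
    return ET.tostring(batch, encoding="unicode")

def pending_messages(batch_xml, sent_ids):
    """Return only the messages whose id is not yet recorded as sent."""
    root = ET.fromstring(batch_xml)
    return [m for m in root.findall("message") if m.get("id") not in sent_ids]

xml_doc = build_batch([("a@example.com", "Hi A", "Hello A"),
                       ("b@example.com", "Hi B", "Hello B")])
# After a crash, suppose id "0" was already recorded as sent:
remaining = pending_messages(xml_doc, {"0"})
```

The sent-id set would live in the MySQL database on the Linux side, updated after each successful hand-off to Qmail.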
2. Can I use Web Services here? If so, I suppose I should create two Web
Services: one on the Linux platform to receive the dataset with personalised
emails, and one on the .NET platform to receive status and results.
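For the results side, something like the following status document is what I imagine the Linux service would POST back to the .NET service once a batch has been processed. (Again Python just for illustration; the element names, the batch id format, and the helpers `status_report`/`parse_status` are invented for this sketch.)

```python
# Sketch: the per-batch status report the Linux side could return,
# listing which message ids were sent and which failed. Illustrative only.
import xml.etree.ElementTree as ET

def status_report(batch_id, sent_ids, failed):
    """failed: dict mapping message id -> error text."""
    report = ET.Element("status", batch=batch_id)
    for mid in sent_ids:
        ET.SubElement(report, "sent", id=mid)
    for mid, reason in failed.items():
        ET.SubElement(report, "failed", id=mid).text = reason
    return ET.tostring(report, encoding="unicode")

def parse_status(xml_doc):
    """Split a status report back into (batch id, sent ids, failures)."""
    root = ET.fromstring(xml_doc)
    sent = {e.get("id") for e in root.findall("sent")}
    failed = {e.get("id"): e.text for e in root.findall("failed")}
    return root.get("batch"), sent, failed

doc = status_report("batch-42", ["0", "1"], {"2": "550 no such user"})
batch, sent, failed = parse_status(doc)
```

The .NET-side service would only need to parse this and update its own records.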
Am I missing something here? Qmail is currently the most reliable part
of this, I think, since it basically never loses mail even if the network or
server goes down. But the data sent to Qmail might be lost due to network
trouble etc. This is an important part of the problem.
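One simple guard I'm considering against corruption in transit: send a checksum alongside the batch and have the receiver acknowledge only after verifying it, so nothing damaged ever reaches Qmail. (Sketch only; `checksum` and `receive_batch` are hypothetical names.)

```python
# Sketch: accept a transferred batch only if its checksum matches,
# otherwise request a resend. Illustrative only.
import hashlib

def checksum(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def receive_batch(data: bytes, expected: str) -> bool:
    """True = batch accepted and safe to queue; False = ask for a resend."""
    return checksum(data) == expected

payload = b"<batch>...</batch>"
digest = checksum(payload)
ok = receive_batch(payload, digest)            # intact transfer
corrupted = receive_batch(payload + b"x", digest)  # damaged transfer
```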
Does anyone have similar experience?
Thanks for any kind of help/hints!
Best regards
Jonah Olsson
Generation Software