PHP Mailer, blocking whole server execution while sending an e-mail - javascript

First, I've seen this question: Send mail without blocking 'execution'
But I'd like to understand why, if PHP is not multi-threaded, it has no problem serving two people who connect at the same time.
I am actually calling the mail script via an Ajax call:
$.ajax({
    url: 'sendMailTemplate.php',
    type: 'POST',
    data: formData,
    processData: false,
    contentType: false,
    success: function (data) {}
    ...
My problem with PHP Mailer is that when I'm sending an email, if it takes 10 seconds, then for those 10 seconds I can't use any feature of my website (and the site appears down for everyone for those 10 seconds).
Should I try to use cron jobs? Is there any tutorial for PHP mailer + cron jobs? What's the best way other than cron jobs?

Should I try to use cron jobs?
Yes.
Is there any tutorial for PHP mailer + cron jobs?
I built my own logic from several references:
The logic below works like a queue; the actual sending time depends on your cron schedule and your hourly mailing limit.
Mailing: when the user clicks "send email", store all the emails in a "temp_table" in your database, with "to, cc, bcc, subject, body" columns.
Write a cron job that runs hourly (or on whatever schedule suits you).
In the cron script, take a batch of emails from "temp_table" (use a query LIMIT to cap the number of records) and send them to the recipients. After successful sending, delete those mails from "temp_table".
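The queue drain the cron script performs can be sketched as follows. This is an illustrative sketch in JavaScript (in practice it would be a PHP script run by cron, sending via PHPMailer); the `drainQueue` name, the batch limit, and the `sendMail` callback are assumptions, not part of the original answer.

```javascript
// Illustrative sketch of the cron job's queue drain: take a limited batch
// from the pending queue, attempt each send, and keep failures queued.
// sendMail is an assumed callback returning true on success.
function drainQueue(queue, batchLimit, sendMail) {
    const batch = queue.slice(0, batchLimit);
    const stillPending = [];
    for (const mail of batch) {
        if (!sendMail(mail)) stillPending.push(mail); // failed sends stay queued
    }
    // successfully sent mails are "deleted"; the rest wait for the next cron run
    return stillPending.concat(queue.slice(batchLimit));
}
```

Each cron run processes at most `batchLimit` mails, which is what keeps you under the hourly mailing limit.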
What's the best way other than cron jobs?
You can contact your service provider to raise your hourly mailing limit, increase your server speed, raise the execution-time limits in php.ini (this doesn't always help), or consider a different language such as Python.

Just switch to an SMTP daemon (smtpd) that allows placing mails in a queue and sending them asynchronously.


How to redirect in JavaScript to another website after establishing an authenticated session using PHP (Codeigniter)

SOLVED
The title wasn't enough to explain the question, so I'll elaborate here.
The Problem
I have two systems; let's call them system A and system B. System A is a Petite-vue/Codeigniter stack. System B could be a Codeigniter or Laravel back-end (the front-end could vary).
Here is a graph showing the setup (R1 and R2 will be explained below).
NOTE: In the graph below R2 is sent from the back-end to system B.
In system A, I am making an asynchronous Fetch request, let's call it R1, from the JavaScript (petite-vue) code to the PHP (Codeigniter) code in the same system (system A). (Code below.)
function Connect_To_System(url) {
    fetch("<?= base_url("connect") ?>", {
        method: "POST",
        body: JSON.stringify({ URL: url })
    })
    .then(res => res.json())
    .then(data => {
        // do something with response data like showing a message
    });
}
R1 is handled in the PHP (Codeigniter) code, which establishes an authenticated session with system B by sending a cURL request, let's call it R2, to system B. After R2's response returns an HTTP 200 status, I want to redirect the user from system A to system B in a new tab. (Code below; note that I'm using REST_Controller.)
public function connect_post() {
    $data = json_decode($this->post()[0]);
    $url = $data->URL;
    $url_login = $url . '/auth/login';
    $token = $this->session->tempdata('token');
    $username = $this->current_user['emp_no'];
    $digest = md5($username . $token);
    $params = array(
        "username" => $username,
        "token" => $token
    );
    $result = $this->postCURL($url_login, $params);
    // Redirect in a new tab when status is 200 somehow
    // Return a response to the JavaScript
}
The problem is that I can't redirect the user to another page from the PHP (Codeigniter) side, because it was invoked by R1; I have to redirect the user from the view that made R1, using JavaScript.
However, that doesn't work, because the session established by R2 is tied to the PHP (Codeigniter) side, or to cURL, I can't really tell; in other words, it's the server that established the session, not the user. Redirecting the user with JavaScript ties the redirect to the user, whatever the redirect method is.
Possible Solution (Not Preferred. EDIT: It is preferred)
The only functional solution is to establish the session with system B from the JavaScript and then redirect the user, which is what I'm currently doing. The problem is that I'm exposing the authentication data to whoever simply decides to open the browser inspector. That's why I'm trying to keep all the sensitive data in the back-end PHP code.
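A minimal sketch of that browser-side approach, assuming system B exposes the same /auth/login endpoint and accepts the username/token pair as JSON (the function name and endpoint shape are assumptions):

```javascript
// Sketch: authenticate against system B directly from the browser, then open
// it in a new tab. Because the browser itself makes the login request, the
// session cookie belongs to the user, not to system A's server.
async function connectAndOpen(baseUrl, username, token) {
    const res = await fetch(baseUrl + "/auth/login", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        credentials: "include", // keep system B's session cookie in the browser
        body: JSON.stringify({ username: username, token: token })
    });
    if (res.ok) window.open(baseUrl, "_blank"); // open system B only on HTTP 200
    return res.ok;
}
```

This is the flow the question ends up adopting; the trade-off is that the credentials pass through the browser, as discussed below.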
The Main Goal
All I want is a way to keep the authentication data hidden, whatever the method may be, if that's possible really.
Thank you in advance.
As it turns out, the solution proposed in the question is actually preferred. Since the user's token is highly personalized, stored in the server session and in the database, it would be very difficult to authenticate yourself using someone else's authentication data. Thus, there is no major security vulnerability. The idea of exposed data scared me so much that it blinded me from seeing that exposing the token isn't a huge vulnerability.

Several asynchronous requests from javascript to PHP apache not been asynchronous

This behavior was not present all the time; it appeared out of nowhere about a month ago and then disappeared just as suddenly. The problem is that I can't identify what happened, and I have no server debugging tools because it only occurs in production.
Roughly 100 Ajax requests are triggered at (almost) the same time by a loop like:
let url = "example.com/";
var methods = ["method1", "method2", "method3", "method4"]; // Roughly 100
$.each(methods, function (index, value) {
    $.ajax({
        url: url + value,
        method: "POST",
        data: { params: "whatever", otherParams: "whatever" }
    }).done(function (data) {
        console.log(data);
    });
});
On the server side (Apache + PHP) there are SELECTs, UPDATEs and INSERTs against a relational database. Apache serves each request in its own thread/process.
When I look at the network console, all the requests start at (roughly) the same time, but here is the problem: each response arrives only after the previous request finishes. If request 1 starts at 0 and takes 5 seconds, request 2 starts at 5, and request 3 starts when request 2 has finished. Every browser shows the same behavior.
The best explanation I could come up with is that the database is locking some table when it performs an UPDATE or INSERT; some tables are huge and, without indexes, could take a long time. But the staging environment points to the same database and works perfectly asynchronously, so what is going on? Is it possible that PHP or Apache could get stuck this way for some reason? I had another wild idea about write contention on log files in the OS (Debian), but I have no idea how that works. I would be glad for any suggestions; maybe I could reproduce the problem in a controlled environment and do something to prevent it happening again.
Some additional information: the API has two clients, one in Angular and the other in JavaScript + PHP. The behavior is exactly the same with both clients.

AJAX queries failing for exactly 60 seconds per time

I have a Javascript function that runs every 5 seconds and requests information from the same server via a jQuery AJAX call. The function runs indefinitely once the page is loaded.
For some reason the AJAX query is failing about once every minute or two, and showing
ERR_EMPTY_RESPONSE
in the console. The odd thing is, it fails for exactly 60 seconds, then starts working fine for another minute or two.
So far I've tried with no success:
Different browser
Different internet connection
Changing the polling interval of the function (it still fails for 60 seconds at a time: e.g. polling every 10 seconds, it fails 6 times in a row; every 5 seconds, 12 times; every 60 seconds, once)
Web searches that suggested flushing the IP settings on my computer
I never had any problem on my last server which was a VPS. I'm now running this off shared hosting with GoDaddy and wonder if there's a problem at that end. Other sites and AJAX calls to the server are working fine during downtimes though.
I also used to run the site over HTTPS, now it's over plain HTTP only. Not sure if relevant.
Here's the guts of the function:
var interval = null;

function checkOrders() {
    interval = window.setInterval(function () {
        $.ajax({
            type: "POST",
            dataType: "json",
            url: "http://www.chipshop.co.nz/ajax/check_orders.php",
            data: { shopid: 699 },
            error: function (errorData) {
                // handle error
            },
            success: function (data) {
                // handle success
            }
        });
    }, 5000); // repeat until switched off, polling every 5 seconds
}
Solved: It turned out the problem was with GoDaddy hosting. Too many POST requests resulted in a 60-second 'ban' from accessing that file. Changing to GET avoided this.
This page contains the answer, from user emrys57:
For me, the problem was caused by the hosting company (Godaddy)
treating POST operations which had substantial response data (anything
more than tens of kilobytes) as some sort of security threat. If more
than 6 of these occurred in one minute, the host refused to execute
the PHP code that responded to the POST request during the next
minute. I'm not entirely sure what the host did instead, but I did
see, with tcpdump, a TCP reset packet coming as the response to a POST
request from the browser. This caused the http status code returned in
a jqXHR object to be 0.
Changing the operations from POST to GET fixed the problem. It's not
clear why Godaddy impose this limit, but changing the code was easier
than changing the host.
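In code, the fix amounts to moving the parameters out of the POST body and into the query string. A sketch (shown with fetch rather than the jQuery of the original; the helper name is an assumption):

```javascript
// Build the polling URL with the parameters in the query string, so the
// request can be issued as a GET instead of a POST.
function buildCheckOrdersUrl(base, shopid) {
    const params = new URLSearchParams({ shopid: String(shopid) });
    return base + "?" + params.toString();
}

// Usage sketch: poll every 5 seconds with GET instead of POST.
// setInterval(function () {
//     fetch(buildCheckOrdersUrl("http://www.chipshop.co.nz/ajax/check_orders.php", 699))
//         .then(function (r) { return r.json(); })
//         .then(function (data) { /* handle success */ });
// }, 5000);
```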

display number of message dynamically using javascript or asp.net

I will do my best to explain my problem, to avoid people pointing me in different directions.
I got an assignment from business people, and I don't know the terminology. It is very similar to email notifications, message notifications on Facebook, or notifications in social media games.
For example, people sent 20 email messages 5 minutes ago, so the screen displays 20 (see attachment). Now 3 more messages have arrived, and the web page should update the number to 23.
Facebook has a similar concept: when friends like or comment on a message, the notification count changes. The same is true of social media games; any status change in the game is reflected.
I kind of have an idea of how to do it cosmetically (in CSS), but how do I do it using JavaScript/ASP.NET? Do I need a postback to refresh the count? I never paid attention to how Facebook, Yahoo Mail, or social games do it; all I know is that something happens and the page updates the status.
Sorry if the question is too broad. If someone can point me in the right direction, I'd appreciate any help.
HTML5 introduced a very interesting concept called Server-Sent Events, in which the server can push data to the client over a long-lived connection.
Eg:
var source = new EventSource("demo_sse.asp");
source.onmessage = function (event) {
    document.getElementById("result").innerHTML += event.data + "<br>"; // append each message
};
And on the server side you can write:
<%
Response.ContentType = "text/event-stream"
Response.Expires = -1
' Writing "data:" is important, and each message must end with a blank line
Response.Write("data: The server time is: " & now() & vbLf & vbLf)
Response.Flush()
%>
However, some old browsers may not support this.
Another way to accomplish this is with an Ajax call:
function checkNewMessages(totalMessages) {
    return $.ajax({
        url: 'demo.asp',
        type: 'GET',
        cache: false,
        data: {
            totalMessages: totalMessages
        }
    });
}

// Checking for new messages every 5 seconds
setInterval(function () {
    checkNewMessages(totalMessages).success(function (data) {
        // display the data wherever you want
    });
}, 5000);
Whatever you write inside Write() on the server side is received here as the response data. To check for new messages continually, the setInterval() above calls the Ajax function every 5 seconds.
There are many ways to do this, depending on how real time you need it to be.
The most common way is to use JavaScript with an XMLHttpRequest to call an ASP.NET page that returns the number of messages; I recommend returning a JSON object. The benefit of this approach is that you can request data from the server without the user experiencing a full page refresh. You can have JavaScript call it every x seconds, depending on your requirements.
Collectively this is known as AJAX, and using a JavaScript library such as jQuery makes it much easier.

How to propagate the ajax.abort() to the controller

I have a problem propagating the abort of my Ajax call to the controller.
I have a JavaScript function in the View code that may be called by a user at any point. It transmits a value to the controller using Ajax; the controller then performs a time-consuming operation on the input and returns a result.
What I want is that, when the user calls the function while the time-consuming operation is already running, it should either:
Stop and start again with the new input (in essence, I need to propagate the abort call up to my controller code and handle it accordingly)
OR
I need to be able to run multiple simultaneous instances of the controller function.
Is this possible? And what is the best way to do it?
View Code
var AJAXSetPalette = null;

function DoSometing(Input) {
    if (AJAXSetPalette)
        AJAXSetPalette.abort(); // cancel any in-flight request first

    AJAXSetPalette = $.ajax({
        type: "POST",
        url: "ImagesAnalysis/DoSomething",
        datatype: "json",
        traditional: true,
        data: Input,
        success: function (Data) {
            DoJSFunction(Data);
        }
    });
}
Controller
public int DoSomething(int Input)
{
    int RetVal;
    // Calculate RetVal from Input, very time consuming
    return RetVal;
}
This is a client-server issue. HTTP is a request/response protocol: the client opens a connection to the server, sends its request, and the server sends the response back over that connection, which may then be closed.
Note: As of HTTP/1.1, connections are often kept alive and reused, but only to avoid the delay of re-establishing them; each request/response exchange still behaves independently.
The point is that once your AJAX request is sent, the server is merrily on its way processing it. If the client aborts the request, no notification is given to the server. When the server attempts to send the response, the send will simply fail, and the server will disregard it and move on to the next request.
That's how the TCP/IP and HTTP protocols were designed to behave, and it's what makes the Internet possible as a loosely connected network of nodes that can drop off or come online at will.
Long and short: there's no way to cancel the request on the server side from the client once it's been sent.
For your scenario, the best thing would be to simply disable the user's ability to issue another request until the server has responded or some timeout period has elapsed. If the request is resource-intensive and a client can call it as often and as fast as it wants, that's a huge opportunity for a DoS attack.
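That mitigation can be sketched client-side as a wrapper that ignores calls while one is in flight. A minimal sketch; the wrapper name and the safety timeout are assumptions, not part of the original answer:

```javascript
// Allow only one in-flight call at a time: further calls are ignored until
// the wrapped function settles or the safety timeout elapses.
function makeSingleFlight(fn, timeoutMs) {
    var busy = false;
    return function (...args) {
        if (busy) return false; // a request is already in flight; ignore this call
        busy = true;
        var release = function () { busy = false; };
        setTimeout(release, timeoutMs); // safety timeout in case the request hangs
        Promise.resolve().then(function () { return fn.apply(null, args); }).finally(release);
        return true;
    };
}
```

Wrapping the Ajax-issuing function (e.g. `DoSometing`) with this prevents a user from hammering the expensive controller action, while the timeout keeps the UI from locking up forever if a response never arrives.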
