tail -f in a web browser - javascript

I've created a Python script that monitors a log file for changes (like tail -f) and displays the output on a console. I would like to access the output of the Python script in a web browser. What would I need to build this? I was thinking about using Django and jQuery. Any tips or examples are greatly appreciated.

First create a Python script that monitors the log file for changes. If you only need this for debugging or testing purposes, then using Django or another web framework is overkill. It is very easy to implement HTTP server functionality using sockets. Whenever an HTTP GET request comes in, serve only the difference since that client's previous request. To achieve this you need to keep the state of every client in memory (e.g. the number of the last line served from the file).
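For illustration, here is a minimal sketch of that idea in Node.js (the answer suggests Python, but the mechanics are the same). The port, log path, and the query-parameter client id are all assumptions:

const http = require('http');
const fs = require('fs');

const LOG_FILE = '/var/log/syslog';  // file to follow (assumed path)
const offsets = {};                  // last byte served, keyed by client id

http.createServer((req, res) => {
  // identify the client somehow; a query parameter is assumed here
  const id = new URL(req.url, 'http://localhost').searchParams.get('id') || 'anon';
  const size = fs.statSync(LOG_FILE).size;
  const from = offsets[id] || 0;
  offsets[id] = size;
  if (size <= from) {
    res.end('<p/>');                 // nothing new since the last request
    return;
  }
  // serve only the bytes appended since the previous request
  res.writeHead(200, { 'Content-Type': 'text/html' });
  fs.createReadStream(LOG_FILE, { start: from, end: size - 1 }).pipe(res);
}).listen(8080);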
The jQuery part is actually quite easy. Set up a timer with the setTimeout function. Something like this will do:
function doUpdate() {
    $.ajax({
        type: "GET",
        url: tailServiceUrl,
        success: function (data) {
            // Data are assumed to be in HTML format;
            // the server returns something like <p/> when there are no updates
            if (data.length > 4) {
                $("#logOutputDiv").append(data);
            }
            // schedule the next poll only after this one completes
            setTimeout(doUpdate, 2000);
        }
    });
}
setTimeout(doUpdate, 2000);
You can also add error and timeout callbacks to report problems with the server.

I don't have any Python or Django experience, but I'd assume you can make a system call like tail from Python and relay the output.
From there, I'd use a jQuery .ajax() call inside a JavaScript setInterval() loop against your Python script, and output the results to a div on the web page. Overall a pretty simple solution.
In this instance, you really wouldn't need an open tail -f system call: because of the nature of the JS setInterval() method, the Python script will be called over and over again until the JS clearInterval() method is called. You'll aggregate your script output in either Python or JS, depending on where you want to do the work. I'd suggest Python, since you'd have more robust features at your fingertips and you would send less data via the AJAX call. Theoretically, there shouldn't be much logic needed in the jQuery code on the front end: just display the data. A rough sketch of the loop follows after the links below.
http://api.jquery.com/jQuery.ajax/
http://www.w3schools.com/jsref/met_win_setinterval.asp
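For example, a minimal version of that loop (the endpoint name and the target div are assumptions):

var pollId = setInterval(function () {
  $.get('/tail', function (data) {  // '/tail' is an assumed endpoint
    $('#log').append(data);         // '#log' is an assumed target div
  });
}, 2000);
// later, when you no longer need updates:
// clearInterval(pollId);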

Why don't you output the data to an HTML file? You could run a cron job to run your script, which would in turn write out an HTML file that could be accessed from the browser.

The most upvoted answer works OK, but there is a more agnostic way to do this.
You can use https://github.com/mthenw/frontail
Just install it and invoke it with the files that you want to watch.
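It is distributed through npm, so (assuming the package name matches the repository) installation is:
npm install -g frontail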
frontail /var/log/syslog /var/log/another_log
then visit http://127.0.0.1:9001
I hope this can help others.

Related

Why would replacing a div via a GET request through AJAX cause my website to slow down so much compared to no JavaScript at all?

So the whole reason I am using AJAX is to make page changes seem smoother. However, I have realized that using AJAX is actually slowing down the website significantly. I am using localhost with Apache, and running PHP on the back end to access a database for various pages.
It's taken up to 5 seconds just to load a single page.
Here is some AJAX:
$(function() {
    $(".menu_nav").click(function() {
        $.ajax({
            type: "GET",
            url: "menu.php",
            dataType: 'html',
            success: function(data) {
                var toInsert = $(data).filter(".placeholder");
                var style = $(data).filter("#style");
                $("#style").replaceWith(style);
                $(".placeholder").replaceWith(toInsert);
                window.scrollTo(0, 0);
            }
        });
    });
});
'menu_nav' and 'home_nav' are both divs with click events attached to them; on click they perform a GET request to the server, asking for a div in the .php file as well as its style sheet. The handler then replaces the div and style sheet on this page with what it retrieved from the GET request. What I am having trouble understanding, though, is why this takes up to 5 seconds, whereas without any JavaScript I get minuscule load times, just less "pretty".
I looked at the timeline and network tabs in the web inspector and noticed that every time I perform one of these requests, I get a new copy of the file from the server rather than reading the one I've already got. That makes sense, because there might be new data in the page since the last visit; without AJAX, however, only one entry shows up in the list of sources. This is what you'd expect from initiating a GET request to the server, but the same thing happens when you click a link without AJAX.
Regardless, I still don't understand what makes this so much slower than not using JavaScript. I understand it is doing more than just a GET request, but is filtering and replacing text after a response really what is causing this issue?
Side question: This is outside the scope of this question, but in regard to AJAX, when I perform a request to the server, is the PHP within the file still executed before it gives me the HTML? So on pages where a user must have certain permissions, will the PHP that checks that still be run?
EDIT: I am hosting a MySQL database through a free subscription to a cloud hosting service. This issue occurs both when I access my website through localhost and when I access the website deployed via the free cloud hosting service, though it is much slower when I use the cloud service. I am also using various resources from my MAMP (macOS Apache, MySQL, PHP; if you're on Windows, WAMP is also available for free) installation.
I'm not sure what is causing your slowness issues, but you could try doing some profiling to narrow down the issue. My guess is that while changing your code to use ajax, you also introduced or revealed some bug that's causing this slowness issue.
Is there an issue with the JavaScript? You can place console.time() and console.timeEnd() in different places to see how long a chunk of JavaScript takes to execute (e.g. at the start and end of your ajax callback; see the sketch below). Based on what you posted, this is likely not the issue, but you can always double-check.
Is it PHP that's running slow? You can use similar profiling functions in PHP to make sure it's not hanging on something.
Are there network issues? You could, for example, log the timestamp of when javascript sent the request and when PHP received it, and vice versa. (this should work OK on localhost, but in other environments you have to be careful of clocks being out of sync)
There's a lot that could be going wrong here, so it's hard to give a more specific answer, but hopefully that gives you some tools to help you start looking.
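For example, a minimal way to time both the request round trip and the DOM work in the callback (the labels are arbitrary):

console.time('request');          // start just before firing the request
$.ajax({
  type: 'GET',
  url: 'menu.php',
  dataType: 'html',
  success: function (data) {
    console.timeEnd('request');   // time from send to response received
    console.time('dom-work');
    // ... your filter()/replaceWith() code here ...
    console.timeEnd('dom-work');  // time spent manipulating the DOM
  }
});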
As for your side question: you are correct - PHP will start sending the HTML while it continues to execute. For example:
<div>
<?php someLongBlockingFunction(); ?>
</div>
<div> will get sent to the browser, then PHP will stall on the long-running-function before it finally sends out the ending </div>. The browser will piece together the chunks, and your event listener won't get called until PHP has finished sending the entire file.

What's the easiest or idiomatic way to mock HTTP requests in 3rd party js libraries with node / js

When I've coded in Ruby or Python, I've been able to use libraries like VCR that intercept HTTP requests and record them, so when, for example, I'm hitting a 3rd party API in tests, I can save that response as a fixture instead of manually building huge mock objects to check behaviour against.
It's not perfect, but it has saved a load of time when I've been exploring which API requests to make against a third party API, (often wrapping a 3rd party library), then writing tests to check this behaviour.
What's the closest thing in JS these days to this?
I'm looking for an open source tool I can require in my test files, so when I run tests where I might call methods on third party APIs, I don't make expensive, slow HTTP requests. I imagine the code might look a bit like:
it('does something I expect it to', () => {
  // set up some state I care about
  let someVar = someSetupCode()
  let library = thirdPartyLib({creds: 'somecreds'})
  library.someMethod()
  // check state has changed
  expect(someVar.value).toBe('what I expect after calling someMethod')
})
Where here, when I call library.someMethod(), instead of hitting actual servers, I'm checking against the values the server would be returning, that I've saved previously.
Monkey patching an existing library or function
I see things like fetch-vcr or axios-vcr, but these seem to rely on explicitly reaching into a library to replace, say, a call to fetch with the HTTP-intercepting version instead, which reads a 'cassette' file containing the canned response.
I'm looking for a way to avoid patching 3rd party code if I can help it, as this is how I understand VCR works for other languages.
Presumably, if there's an HTTP client built into Node somewhere, then that would be the place you'd patch a function; I haven't come across a specific library that does this.
Running an entire HTTP server
Alternatively, I can see libraries like vcr.js or yakbak, which essentially set up an HTTP server that serves JSON blobs you define at various URLs, like serving a saved users.json file at http://localhost:8100/users/
This is okay, but again, if I don't need to spin up a whole HTTP server, and make actual HTTP requests, that would be wonderful.
Oh, hang on: it looks like sepia from LinkedIn works well, for Node.js at least.
I haven't looked into it too much, but I'd welcome comments if you have been using it.
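For what it's worth, sepia's model (as I read its README; treat the specifics below as assumptions to verify against the repo) is that you require it before anything that uses Node's http module, and control record/playback with an environment variable:

// run as: VCR_MODE=record node test.js, then VCR_MODE=playback to replay
require('sepia');               // must be required before other modules
var http = require('http');

http.get('http://example.com/users', function (res) {
  // the first (record) run hits the network and writes a fixture;
  // subsequent (playback) runs replay it without touching the network
});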
Probably SoapUI will work for you. Despite its name, it also works with REST APIs.

Is there any alternative to setInterval()?

I am reading a file repeatedly, with a delay between reads, like this:
setInterval(function() {
    $.getJSON("json/someFile.json", function(data) {
        // Some code
    });
}, 5000);
I am reading this file repeatedly because it is being updated by another part of the code. I want to avoid using setInterval().
Is there any way to know that the file has been updated, and to read it only when that happens?
Firstly, setInterval is a native JavaScript method; it does not come from jQuery. Second, what you've done is called polling, meaning that you request some information periodically in order to keep it up to date. The alternative is using WebSockets. WebSockets are a two-way connection between the client and the server, which can both push and receive messages. This way, you can send a socket message to the client whenever the file is updated on the back end.
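A rough sketch of that push model, using Node with the ws package and fs.watch (the file path, port, and message shape are all assumptions):

// server.js
const fs = require('fs');
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 });

// when the file changes, push its fresh contents to every connected client
fs.watch('json/someFile.json', () => {
  const payload = fs.readFileSync('json/someFile.json', 'utf8');
  wss.clients.forEach((client) => {
    if (client.readyState === WebSocket.OPEN) client.send(payload);
  });
});

// in the browser
const socket = new WebSocket('ws://localhost:8080');
socket.onmessage = (event) => {
  const data = JSON.parse(event.data);
  // ... the same code you had in the getJSON callback ...
};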
I'm assuming you're talking about client-side code. Then no: there is no way to "watch" a JSON file like you could with a file watcher in "regular" applications. You need either:
Interval-based checking as you're doing now. However, as suggested in the comments by @George, you might be better off using setTimeout and only re-firing the Ajax request in specific situations (e.g. on success, perhaps not on failures). With your current approach the function runs on the interval regardless, and if the server takes longer than the interval to respond, you get a build-up of requests;
Websockets (potentially with fallback to something like long-polling), perhaps using another library for that + the server-side part of this solution;
No other way I'm afraid.
As a footnote, this hasn't got much to do with jQuery. First, setInterval is not jQuery's but a regular window function, and second, the problem of "watching" a file isn't specific to how you're making the Ajax call (you're using jQuery here, but you could use another lib for it too).

QUnit JavaScript client logging

We have a small wrapper framework to enable client-side JavaScript logging. We have exposed different functions like LogError, LogWarn, and LogInfo in JavaScript:
var JavaScriptLogger = {
    LogError: function (exception) {
        postAjax(exception);
        WriteConsoleLog(exception);
    }
};
Whenever a user logs an error using JavaScriptLogger.LogError(exception), two things get executed:
1. postAjax(exception): an AJAX POST is made to the specified server URL.
2. WriteConsoleLog(exception): writes the exception via console.error/console.warn/console.log, depending on the error type passed by the user.
Now I want to have unit test cases for the LogError, LogWarn, and LogInfo functions using the QUnit framework.
Could somebody suggest how to start and how to go about it?
Regards,
SCP
Well, first off, I hope you understand that someone could flood your server logs this way, and hopefully you have something in place to prevent this.
Second, you would want to mock that Ajax call so that you aren't relying on your server to work (eliminating one source of potential errors). You could use something like Mockjax for that.
After that, you would probably want to run all of those tests in an asyncTest call (versus a simple test) since the Ajax call is still supposed to be asynchronous.
http://api.qunitjs.com/asyncTest/
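For instance, a sketch using QUnit 1.x's asyncTest, stubbing $.ajax and console.error directly rather than through Mockjax. It assumes postAjax ultimately goes through $.ajax and WriteConsoleLog through console.error; adapt it to your wrapper's internals:

asyncTest("LogError posts to the server and writes to the console", function () {
  expect(2);
  var realAjax = $.ajax, realError = console.error;
  $.ajax = function (settings) {        // stub out the network call
    ok(settings, "an AJAX request was attempted");
    return $.Deferred().resolve().promise();
  };
  console.error = function () {         // stub out the console write
    ok(true, "console.error was called");
    $.ajax = realAjax;                  // restore the originals
    console.error = realError;
    start();                            // tell QUnit the async test is done
  };
  JavaScriptLogger.LogError(new Error("boom"));
});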
(PS Sorry if my formatting is bad, typing this on my phone.)

What is the best way to handle streaming JS content in browser?

Imagine we have a server-side application that generates streaming content full of JavaScript commands. The easiest way for me to show an example application is to use Python/Flask; however, you can do this in any language by flushing the output after each iteration. So, for a sample server-side application:
from time import sleep
from flask import Flask, Response

app = Flask(__name__)

@app.route('/stream', methods=['POST'])
def stream():
    def generate():
        for i in range(10):
            sleep(1)
            yield 'console.log("Iteration: %d");\n' % i
    return Response(generate(), mimetype='application/javascript')
which returns (over 10 seconds, with 1-second pauses) this kind of output:
console.log("Iteration: 0");
console.log("Iteration: 1");
console.log("Iteration: 2");
...
console.log("Iteration: 9");
I need to create a "parent" HTML/JavaScript page which handles and executes these commands on the fly, i.e. without waiting until all 10 iterations have loaded. It should also be able to make POST requests to the mentioned server-side application.
Here are the options which I have tried.
I tested the jQuery Ajax method with different options, but it still needs the full generated output before it executes all the commands at once.
The other idea was to use an iframe. It works fine, but in order to use it I need to rephrase my output from console.log("Iteration: 0"); to <script language="JavaScript">console.log("Iteration: 0");</script> with content type text/html, and also to simulate a POST form submission to the target iframe (see the sketch after this list).
I have read about WebSockets. However, since this technology is not universally supported at the moment, and my application should be able to work with on-the-fly content already now, I decided not to deal with it.
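For reference, a bare-bones version of that iframe approach (all names are made up):

<!-- parent page: POST to the streaming endpoint, render into a hidden iframe -->
<form action="/stream" method="post" target="sink">
  <input type="submit" value="Start">
</form>
<iframe name="sink" style="display: none;"></iframe>

with the server wrapping each chunk as <script>console.log("Iteration: 0");</script> so the browser executes it as it arrives.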
Another very important thing: the output should be a stream, since the server-side application runs a long-lasting process; so setTimeout(function() { $.ajax(...); }, 1000); is not the solution.
To summarize, I have tried several options, but a simple iframe is the only solution that really works at the moment. Otherwise, most probably I am missing something. Any thoughts and constructive ideas are very much appreciated.
Thank you in advance!
Long-polling and Comet are options, but these are hacks. The iframe method you mentioned isn't terrible, but it has some state issues if you need to recover the connection.
I would encourage you to reconsider WebSockets. There is a lovely shim available on GitHub which uses Flash (which has had socket support for some time now) as a fallback. You can write client-side code as if WebSockets exist, and the shim adds support to browsers that don't have it. Great!

Categories

Resources