I am using Application Insights to track events in my web pages:
appInsights.trackEvent("my-event", { test: true });
However, I can see that each entry in the log also collects some additional info, such as:
User Id
Session Id
Operation name
The last one is sensitive, as it can contain the name of the computer or other identifying details. In order to comply with the GDPR, I want to strip that info out of my log.
How do I tell Application Insights to process the data before logging it? In my case, I would like to get access to the object that will be sent out by trackEvent and modify it before it is transmitted.
You can use telemetry initializers for that. They allow you to modify items before they are sent to Application Insights.
In your case it could be as simple as
appInsights.queue.push(function () {
    appInsights.context.addTelemetryInitializer(function (envelope) {
        envelope.tags['ai.operation.name'] = 'xxx';
    });
});
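If you want to strip several context tags at once, here is a sketch along the same lines; the tag keys below are the standard Application Insights context tag names for user and session, and whether blanking or deleting them works best for you is an assumption, not something from the original answer:

appInsights.queue.push(function () {
    appInsights.context.addTelemetryInitializer(function (envelope) {
        // Remove or overwrite the context tags you consider sensitive.
        var tags = envelope.tags;
        delete tags['ai.user.id'];
        delete tags['ai.session.id'];
        tags['ai.operation.name'] = 'redacted';
    });
});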
Background: I am using the MEAN stack to build a web app; I am still learning.
The Issue: I find the following confusing. Say I have a user logged in (I am using Passport.js). From Angular I can retrieve it by querying my Node.js server.
What I am doing now is something similar to:
app.get('/userLogged', function (req, res) {
    res.json(req.user);
});
This does not sound safe to me. I might be a novice, but I have seen this in many tutorials. With a console.log in the browser I can print all the info about the user, including the hashed password. My guess is that I should send a minimal set of information to the browser, filtering out the rest.
My Question: is this safe at all, or am I just leaving the door open to hackers?
Take a look at the concept of ViewModel. It represents the data you want to share publicly with an external user of the system.
What can be achieved in your case is implementing the right view model out of the data model you store internally. A simplistic example illustrating this concept would be to create a view model for your user object that picks only the data you would like to send back:
// This function will return a different version
// of the `user` object having only a `name`
// and an `email` attribute.
var makeViewModel = function (user) {
return _.pick(user, ['name', 'email']);
}
You will then be able to construct the right view model on demand:
app.get('/user', function (req, res) {
    res.json(makeViewModel(req.user));
});
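Note that _.pick above comes from a utility library such as Underscore or Lodash. If you would rather avoid that dependency, a plain-JavaScript sketch of the same idea:

// Plain-JS equivalent of the helper above: expose only the fields you whitelist.
var makeViewModel = function (user) {
    return {
        name: user.name,
        email: user.email
    };
};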
I have a Torii adapter that is posting my authorization tokens (e.g. Facebook and Twitter) back to my API to establish sessions. In the open() method of my adapter, I'd like to know the name of the provider to write some logic around how to handle the different types of providers. For example:
// app/torii-adapters/application.js
export default Ember.Object.extend({
open(authorization) {
if (this.provider.name === 'facebook-connect') {
var provider = 'facebook';
// Facebook specific logic
var data = { ... };
}
else if (this.provider.name === 'twitter-oauth2') {
var provider = 'twitter';
// Twitter specific logic
var data = { ... };
}
else {
throw new Error(`Unable to handle unknown provider: ${this.provider.name}`);
}
return POST(`/api/auth/${provider}`, data);
}
});
But, of course, this.provider.name is not correct. Is there a way to get the name of the provider used from inside an adapter method? Thanks in advance.
UPDATE: I think there are a couple ways to do it. The first way would be to set the provider name in localStorage (or sessionStorage) before calling open(), and then use that value in the above logic. For example:
localStorage.setItem('providerName', 'facebook-connect');
this.get('session').open('facebook-connect');
// later ...
const providerName = localStorage.getItem('providerName');
if (providerName === 'facebook-connect') {
// ...
}
Another way is to create separate adapters for the different providers. There is code in Torii to look for e.g. app-name/torii-adapters/facebook-connect.js before falling back on app-name/torii-adapters/application.js. I'll put my provider-specific logic in separate files and that will do the trick. However, I have common logic for storing, fetching, and closing the session, so I'm not sure where to put that now.
UPDATE 2: Torii has trouble finding the different adapters under torii-adapters (e.g. facebook-connect.js, twitter-oauth2.js). I was attempting to create a parent class for all my adapters that would contain the common functionality. Back to the drawing board...
UPDATE 3: As #Brou points out, and as I learned talking to the Torii team, fetching and closing the session can be done—regardless of the provider—in a common application adapter (app-name/torii-adapters/application.js) file. If you need provider-specific session-opening logic, you can have multiple additional adapters (e.g. app-name/torii-adapters/facebook-oauth2.js) that may subclass the application adapter (or not).
Regarding the session lifecycle in Torii: https://github.com/Vestorly/torii/issues/219
Regarding the multiple adapters pattern: https://github.com/Vestorly/torii/issues/221
Regarding the new authenticatedRoute() DSL and auto-session-fetching in Torii 0.6.0: https://github.com/Vestorly/torii/issues/222
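For reference, a minimal sketch of the layout described in UPDATE 3, reusing the illustrative POST helper from the question; the file names follow Torii's adapter lookup, while the endpoints and the shape of the authorization payload are assumptions about your API and provider:

// app/torii-adapters/application.js -- shared session logic (sketch)
import Ember from 'ember';

export default Ember.Object.extend({
  fetch() {
    // Restore an existing session, e.g. from a cookie your API recognizes.
    return POST('/api/auth/fetch');
  },
  close() {
    return POST('/api/auth/logout');
  }
});

// app/torii-adapters/facebook-oauth2.js -- provider-specific open() (sketch)
import ApplicationAdapter from './application';

export default ApplicationAdapter.extend({
  open(authorization) {
    // Facebook-specific handling of the authorization payload goes here.
    return POST('/api/auth/facebook', { token: authorization.accessToken });
  }
});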
UPDATE 4: I've written up my findings and solution on my personal web site. It encapsulates some of the ideas from my original post, from #brou, and other sources. Please let me know in the comments if you have any questions. Thank you.
I'm not an expert, but I've studied simple-auth and torii twice in the last few weeks. At first, I realized that I needed to level up on too many things at the same time, and ended up delaying my login feature. Today, I'm back on this work for a week.
My question is: What is your specific logic about?
I am also implementing provider-agnostic processing AND later common processing.
This is the process I've started implementing:
User authentication.
Basically, calling torii default providers to get that OAuth2 token.
User info retrieval.
Getting canonical information from the FB/GG/LI APIs, in order to create as few sessions as possible for a single user across different providers. This is thus API-agnostic.
➜ I'd then do: custom sub-providers calling this._super(), then doing this retrieval.
User session fetching or session updates via my API.
Using the previous canonical user info. This should then be the same for any provider.
➜ I'd then do: a single (application.js) torii adapter.
User session persistence against page refresh.
Theoretically, using simple-auth's session implementation is enough.
Maybe the only difference between our work is that I don't need any authorizer for the moment, as my back-end is not yet secured (I still run locally).
We can keep in touch about our respective progress: this is my week task, so don't hesitate!
I'm working with ember 1.13.
Hope it helped,
Enjoy coding! 8-)
I have a simple app built using Node, Express, and Socket.io on the server side. My page queries my API when it needs to retrieve data that will not change, and uses WebSockets for getting live updates from the server for dynamic data. The app allows a single person, the "Supervisor", to send questions to any number of "Users" (unauthenticated) and view their answers as they trickle in. The Users send their data to the server using a POST request, and it is streamed to the Supervisor over a WebSocket. The server stores user data in a simple array, and uses an ES6 Map from the items in the array (users) to objects containing each user's questions and answers, like this:
class User {
    constructor(id) {
        this.id = id;
    }
}

let users = [], qa = new Map();

io.on('connection', socket => {
    let user = new User(socket.id);
    users.push(user);
    qa.set(user, {});
    socket.on('question-answered', ({id, answer}) => {
        let questionData = qa.get(user);
        questionData[id] = answer;
        qa.set(user, questionData);
    });
});
This is obviously a very primitive way of handling data, but I don't see the need for additional complexity. The data doesn't need to persist across server crashes or restarts (the user's questions and answers are also stored in localStorage), and MongoDB and even Redis just seem like overkill for this kind of data.
So my question is, am I going about this the right way? Are there any points I'm missing? I just want a simple way to store data in memory and be able to access it through client-side GET requests and socket.io. Thank you for any help.
If an array and a map provide you the type of access you need to the data and you don't need crash persistence and you have an appropriate amount of memory to hold the amount of data, then you're done.
There is no need for more than that unless your needs (query, persistence, performance, multi-user access, crash recovery, backup, etc.) require something more complicated. A simple cliché applies here: if it ain't broke, it don't need fixing.
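One small caveat if you later expose that Map through one of your GET endpoints: a Map doesn't serialize to JSON directly, so you would flatten it first. A sketch, assuming an Express app and an illustrative route name:

// Illustrative Express route exposing the in-memory answers to the Supervisor.
app.get('/answers', (req, res) => {
    const payload = users.map(user => ({
        id: user.id,                 // the socket id stored on the User instance
        answers: qa.get(user) || {}  // the question/answer object kept in the Map
    }));
    res.json(payload);
});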
I'm using Hello.js in my AngularJS application to let users authenticate with Facebook. Once the user is logged in and I get the user information from Facebook, I use that JSON to get the user object from my own database and store it in the root scope, so that various areas of my site can access the user's profile info (for example the header, where I display the logged-in user's name).
Here's the service that handles the authentication stuff
angular.module('myApp')
.factory('userService', function($rootScope, $location) {
// Hello.js Functions
hello.init({
facebook : '1234567891234545'
});
var service = {
isLoggedIn: function() {
return $rootScope.loggedInUser != null;
},
login: function() {
hello('facebook').login( function() {
hello('facebook').api('/me').success(function(json) {
$rootScope.loggedInUser = getUserFromMyOwnAPI(json.id);
$rootScope.$apply(function() {
$location.path('/');
});
});
});
},
logout: function() {
hello('facebook').logout( function() {
$rootScope.loggedInUser = null;
$location.path('/');
});
},
loggedInUser: function(){
return $rootScope.loggedInUser;
}
}
return service;
})
The issue I'm having is that every time I refresh the page, I lose the profile info. Makes sense, because $rootScope.loggedInUser, which stores the user data (based on the JSON I got back from Facebook), would get reset after a page refresh.
How should I handle this? Should I be putting the user data in localStorage instead of the rootScope? Or should I somehow be leveraging hello('facebook').api('/me') each time I want to reference the user's profile info?
I noticed that hello.js already stores something in localStorage:
key: hello
{"facebook":{"state":"","access_token":"blahblahblah","expires_in":6776,"https":"1","client_id":"12345","network":"facebook","display":"none","redirect_uri":"http://adodson.com/hello.js/redirect.html","scope":"basic","expires":1412632794.806}}
So I'm wondering if I would be duplicating the effort by adding the user object to localStorage.
The answer to this question is subjective and could be argued different ways. However, based on your question and comment, my opinion is that retaining (non-sensitive) user profile information in local storage would be a good option to provide an uninterrupted user experience.
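A minimal sketch of that approach, assuming you only keep non-sensitive fields; the storage key and field names are illustrative, not taken from your code:

var STORAGE_KEY = 'loggedInUser';

// On service initialization, restore a previously saved profile, if any.
var saved = localStorage.getItem(STORAGE_KEY);
if (saved) {
    $rootScope.loggedInUser = JSON.parse(saved);
}

// After a successful login, persist only what the UI actually needs.
function rememberUser(user) {
    $rootScope.loggedInUser = user;
    localStorage.setItem(STORAGE_KEY, JSON.stringify({
        id: user.id,
        name: user.name
    }));
}

// On logout, clear both the scope and the stored copy.
function forgetUser() {
    $rootScope.loggedInUser = null;
    localStorage.removeItem(STORAGE_KEY);
}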
Never store user information, including but not limited to social network info, inside localStorage, cookies, or other unsecured mechanisms. Even if you need to make a compromise for some not-so-critical information, use memory (like scope variables) instead. You also have to account for the complexities that arise from storing user info in localStorage, such as when the user logs out of the social network.

Now, session tracking is as old as the WWW itself. How do you track sessions? There are countless articles discussing the pros and cons of cookies, JSON Web Tokens, server-side session state, and so on. My personal opinion is to store most of the user info server side and link the current user to that session using a session ID stored in any possible medium, like a cookie, query param, or localStorage. That's only useful if you have a backend and your OAuth provider gives you a token, and I don't see anywhere in hello.js that it provides you a token. So, given that you shouldn't store any user info client side in the browser, and hello.js doesn't provide you with a token to reuse in subsequent calls, my advice is to log the user in every single time.
about your code:
As Adin and DanArl already stated, you can implement the process of the user's session tracking in many different ways - preferably server side, identified via some kind of identifier stored in a session cookie.
Concerning your actual code, you may have a look at jshint:
Three warnings
10 Use '!==' to compare with 'null'.
31 Missing semicolon.
34 Missing semicolon.
Three undefined variables
1 angular
4 hello
13 hello
14 hello
23 hello
15 getUserFromMyOwnAPI
You should inject 'hello' correctly, by passing 'hello' to your 'userService'.
(have a look at: https://docs.angularjs.org/guide/di if you want more information about dependency injection in AngularJS)
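For example, one way to do that is to register the global hello.js object with your module so it can be injected like any other dependency; this is only a sketch, with everything except the module name assumed:

angular.module('myApp')
    // Expose the global hello.js object as an injectable constant.
    .constant('hello', window.hello)
    .factory('userService', function($rootScope, $location, hello) {
        // Same service as in the question, but every hello(...) call now uses
        // the injected reference instead of the global.
        return {
            isLoggedIn: function() {
                return $rootScope.loggedInUser != null;
            }
            // login, logout, loggedInUser as before
        };
    });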
about your actual problem:
After you have received the token (from your preferred way of storing it), you should be able to inspect and validate the token via:
GET graph.facebook.com/debug_token?
input_token={token-to-inspect}
&access_token={app-token-or-admin-token}
Source: https://developers.facebook.com/docs/facebook-login/manually-build-a-login-flow/v2.1#checktoken
Then you may have a shot at passing this information to hello.js (I'm used to this library).
I need users to be able to post data from a single page browser application (SPA) to me, but I can't put server-side code on the host.
Is there a web service that I can use for this? I looked at Amazon SQS (Simple Queue Service), but I can't call their REST APIs from within the browser due to the cross-origin policy.
I favour ease of development over robustness right now, so even just receiving an email would be fine. I'm not sure that the site is even going to catch on. If it does, then I'll develop a server-side component and move hosts.
Not only are there web services, but nowadays there are robust systems that provide a way to run server-side logic for your applications. They are called BaaS, or Backend as a Service, providers, and they usually provide a backbone for your front-end applications.
Although they have multiple uses, I'm going to list the most common in my opinion:
For mobile applications - Instead of having to learn an API for each device you code for, you can use a standard platform to store logic and data for your application.
For prototyping - If you want to create a slick application but don't want to code all the backend logic for the data (and deal with all the operations and system administration that represents), a BaaS provider means you only need good front-end skills to code the simplest CRUD applications you can imagine. Some BaaS even allow you to bind Reduce algorithms to the calls you perform against their API.
For web applications - When PaaS (Platform as a Service) came to town to ease the job of back-end developers and avoid the hassle of system administration and operations, it was only logical that the same would eventually happen to the backend logic itself. There are many clones that showcase the real power of this strategy.
All of this is amazing, but I have yet to name any of them. I'm going to list the ones that I know best and have actually used in projects. There are probably many more, but as far as I know, these have satisfied most of my needs for any of the uses previously mentioned.
Parse.com
Parse's most outstanding features target mobile devices; however, nowadays Parse contains an incredible number of APIs that allow you to use it as a full-featured backend service for JavaScript, Android, and even Windows 8 applications (the Windows 8 SDK was introduced a few months ago this year).
How does Parse code look in JavaScript?
Parse works through classes and objects (ain't that beautiful?), so you first create a specific class (this can be done through JavaScript, REST, or even the Data Browser manager) and then you add objects to specific classes.
First, add Parse as a script tag in your page:
<script type="text/javascript" src="http://www.parsecdn.com/js/parse-1.1.15.min.js"></script>
Then, using a given Application ID and a JavaScript Key, initialize Parse:
Parse.initialize("APPLICATION_ID", "JAVASCRIPT_KEY");
From there, it's all object manipulation
var Person = Parse.Object.extend("Person"); //Person is a class *cof* uppercase *cof*
var personObject = new Person();
personObject.save({name: "John"}, {
success: function(object) {
console.log("The object with the data "+ JSON.stringify(object) + " was saved successfully.");
},
error: function(model, error) {
console.log("There was an error! The following model and error object were provided by the Server");
console.log(model);
console.log(error);
}
});
What about authentication and security?
Parse has a User-based authentication system, which pretty much allows you to store a base of users that can manipulate the data. If you map the data to User information, you can ensure that only a given user can manipulate specific data. Plus, in the settings of your Parse application, you can specify that clients are not allowed to create classes, to ensure unnecessary calls are not performed.
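For illustration, signing up a user with the same generation of the SDK used above looks roughly like this; all field values are placeholders:

var user = new Parse.User();
user.set("username", "john");
user.set("password", "secret");
user.set("email", "john@example.com");

user.signUp(null, {
    success: function(user) {
        console.log("Signed up and logged in as " + user.get("username"));
    },
    error: function(user, error) {
        console.log("Sign-up failed: " + error.message);
    }
});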
Did you REALLY use it in a web application?
Yes, it was my tool of choice for a medium fidelity prototype.
Firebase.com
Firebase's main feature is the ability to provide real-time updates to your application without all the hassle. You don't need a MeteorJS server in order to bring push notifications to your software. If you know JavaScript, you are halfway to bringing real-time magic to your users.
How does Firebase code look in JavaScript?
Firebase works in a REST fashion, and I think they do an amazing job structuring the Glory of REST. As a good example, look at the following Resource structure in Firebase:
https://SampleChat.firebaseIO-demo.com/users/fred/name/first
You don't need to be a rocket scientist to know that you are retrieving the first name of the user "Fred", given there's at least one (usually there would be a UUID instead of a name, but hey, it's an example, give me a break).
In order to start using Firebase, as with Parse, add their CDN JavaScript:
<script type='text/javascript' src='https://cdn.firebase.com/v0/firebase.js'></script>
Now, create a reference object that will allow you to consume the Firebase API
var myRootRef = new Firebase('https://myprojectname.firebaseIO-demo.com/');
From there, you can create a bunch of neat applications.
var USERS_LOCATION = 'https://SampleChat.firebaseIO-demo.com/users';
var userId = "Fred"; // Username
var usersRef = new Firebase(USERS_LOCATION);
usersRef.child(userId).once('value', function(snapshot) {
var exists = (snapshot.val() !== null);
if (exists) {
console.log("Username "+userId+" is part of our database");
} else {
console.log("We have no register of the username "+userId);
}
});
What about authentication and security?
You are in luck! Firebase released their Security API about two weeks ago! I have yet to explore it, but I'm sure it fills most of the gaps that allowed random people to use your reference for their own purposes.
Did you REALLY use it in a web application?
Eeehm... ok, no. I used it in a Chrome Extension! It's still in progress, but it's going to be a real-time chat inside a Chrome Extension. Ain't that cool? Fine. I find it cool. Anyway, you can browse more awesome examples for Firebase on their examples page.
What's the magic of these services? If you've read up on Dependency Injection and Mock Object Testing, you'll know that at some point you can completely replace any of these services with your own REST web service provider.
Since these services were created to be used inside any application, they are CORS ready. As stated before, I have successfully used both of them from multiple domains without any issue (I'm even trying to use Firebase in a Chrome Extension, and I'm sure I will succeed soon).
Both Parse and Firebase have Data Browser managers, which means that you can see the data you are manipulating through a simple web browser. As a final disclaimer, I have no relationship with any of those services other than the fact that James Taplin (Firebase co-founder) was amazing enough to lend me some beta access to Firebase.
You actually CAN use SQS from the browser, even without CORS, as long as you only need the browser to send messages, not receive them. Warning: this is a kludge that would make my CS professors cry.
When you perform a GET request via JavaScript, the browser will always perform the request; however, you'll only get access to the response if it came from the same origin (protocol, host, port). This is your ticket to ride, since messages can be posted to an SQS queue with just a GET, and who really cares about the response anyway?
Assuming you're using jQuery, that your queue is https://sqs.us-east-1.amazonaws.com/71717171/myqueue, and that it allows anyone to post a message, the following will post a message with the body "HITHERE" to the queue:
$.ajax({
url: 'https://sqs.us-east-1.amazonaws.com/71717171/myqueue' +
'?Action=SendMessage' +
'&Version=2012-11-05' +
'&MessageBody=HITHERE'
});
There'll be an error in the console saying that the request failed, but the message will show up in the queue anyway.
Have you considered JSONP? That is one way of calling cross-domain scripts from JavaScript without running into the same-origin policy. You're going to have to set up a script somewhere to send you the data, though. JavaScript alone just isn't up to the task.
Depending on what kind of data you want to send, and what you're going to do with it, one way of solving it would be to post the data to a Google Spreadsheet using Ajax. It's a bit tricky to accomplish, though. Here is another Stack Overflow question about it.
If presentation isn't that important, you can just have an embedded Google Spreadsheet form.
What about mailto:youremail@goeshere.com? ihihi
In the meantime, you can spin up some free hosting like Altervista or Heroku, or something else like them, so you can connect to their servers. If I remember correctly, these free services allow you to run server-side code, so you can create a sort of personal web service and push Ajax requests to it. Obviously their servers are slow for free accounts, but I think that's enough if you don't have much user traffic; otherwise you should move to a better VPS, hosting, or cloud solution.
Maybe CouchDB can provide what you're after. IrisCouch provides free CouchDB instances. Lock it down so that users can't view documents, add a sensible validation function, and you've got yourself an easy RESTful place to stick your data.
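As a rough sketch of what that looks like from the browser (jQuery as in the SQS example; the host and database name are placeholders, and the CouchDB instance needs CORS enabled for your origin):

// POST a new document straight into a CouchDB database over its plain REST API.
$.ajax({
    type: 'POST',
    url: 'https://yourname.iriscouch.com/submissions',
    contentType: 'application/json',
    data: JSON.stringify({
        answer: 'HITHERE',
        submittedAt: new Date().toISOString()
    })
});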