What I want is a funnel report showing where users gave up on a form, which fields they completed, and where they went after giving up.
In my research, I've found two ways of tracking this in Analytics. The first is to create a virtual page for each field, triggering _trackPageview on completion, and making a goal with a funnel that has the final page as the goal, e.g. /form/open-studio/received, and the other fields as steps, for example /form/open-studio/name and /form/open-studio/email (along with form submission, /form/open-studio/send).
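A minimal sketch of the virtual-pageview approach with the classic ga.js `_gaq` queue (the form and field names here are just examples from the question):

```javascript
// Build a virtual path like /form/open-studio/name for each field.
function virtualPath(formName, fieldName) {
  return '/form/' + formName + '/' + fieldName;
}

// On a real page, _gaq is provided by ga.js; before the script loads it
// is just an array of queued commands, which is how the snippet starts.
var _gaq = _gaq || [];

function trackFieldCompletion(formName, fieldName) {
  _gaq.push(['_trackPageview', virtualPath(formName, fieldName)]);
}

// Wire-up would look something like this (jQuery, fire once per field):
// $('#open-studio input[name=name]').one('blur', function () {
//   trackFieldCompletion('open-studio', 'name');
// });
trackFieldCompletion('open-studio', 'name');
```

These virtual pageviews are exactly what then shows up as funnel steps, and also exactly what inflates the pageview report.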
This solution was beautiful until I found out that it inflates my pageview reports and decreases time on page. In an article I found a workaround that involves creating a filter excluding pages under a /form "subdirectory" and creating a second profile just to track these fields. But managing two profiles in JavaScript, and everything else about this solution, just doesn't feel right.
My question about this approach is: is there a way to create a filter (globally or just in the reports) that doesn't count these /form pages as real pageviews but still works for tracking the funnel?
The second way I found is tracking field completion with events. This solution feels more natural and organic: I could make a goal with one event category per form, with each field as a different action. The problems with this approach are that each triggered event completes the goal (which is not what I want), and GA doesn't seem to be able to build funnels out of events. I may be wrong (and I hope I am), but even though this looks like the right option, the funnel report is very important for the client.
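For comparison, a sketch of the event-based approach with the same classic API. The category/action/label naming scheme here is an assumption, not a GA requirement:

```javascript
// One event category per form, one action for completion, and the
// field name as the label.
function fieldEvent(formName, fieldName) {
  return ['_trackEvent', 'form: ' + formName, 'completed', fieldName];
}

// As above, _gaq is the classic ga.js command queue.
var _gaq = _gaq || [];

function trackFieldEvent(formName, fieldName) {
  _gaq.push(fieldEvent(formName, fieldName));
}

trackFieldEvent('open-studio', 'email');
```

Events don't inflate pageviews, which is why this route feels cleaner despite the funnel limitation.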
Which of these two is the "right" approach? Can I make this work without screwing up the other reports or having to sell my soul to a JavaScript GA code mess?
Also, if there is any other option instead of choosing between hell and limbo, please, let me know.
Check out Heap Analytics. I think you will find it a better solution both for ease of implementation and ease of viewing results for what you're requesting. The process would be:
Create a Heap Analytics account. (Free 60 day trial.)
Add Heap Javascript snippet to your site. (Just copy and paste, you won't need to worry about implementing additional javascript on each field.)
Let it run for enough time to get a decent sample size (depends on the amount of traffic you get on your site).
Log in to Heap Analytics and name the autocaptured events on each page and each pageview.
Create a funnel from the named events to see where the problem is. You can also browse user paths in the Lists report, filtering by users who have done a particular action, to see where they go when they leave the form.
Related
I'm new to ASP.NET and web development. I have an application that performs some calculations based on the dates selected on a form and gets some data from a SQL database. Every time the user changes the date, the form should be updated with the latest calculations, which means a postback, and that is a bit annoying. What is the best way/method/language to retrieve the data into my form without posting the form back? Is it possible with JavaScript? I tried the UpdatePanel, but it is still annoying and slow with big calculations. Could you please give me an example to start with? I have the stored procedure below on my database, and let's say I have a TextBox that will get the returned value of @category:
ALTER PROCEDURE Test2sp
    @id_ref nvarchar(50),
    @category nvarchar(50) OUTPUT
AS
BEGIN
    SELECT @category = category FROM employees WHERE id_ref = @id_ref
END
GO
Well, software is part science, but part art, so to speak.
So in software it is RARE that "one solution" is the best.
In your question the goal is to avoid a postback.
But you should supply more information as to why that postback is to be avoided.
means a post back, which a bit annoying,
Hum, ok, not really a solid narrative here.
The issue is that we have quite a few choices, and which road to take will depend on your needs.
For example, there is often a quick and easy solution:
We can introduce what is called an UpdatePanel.
They are easy to set up, and they reduce effort (developer cost) on your part.
However, if you abuse UpdatePanels, you wind up not saving performance, but you might save developer cost and time!!
We don't all have unlimited time or unlimited government budgets for our software choices.
So, UpdatePanels are OFTEN lamented by many in ASP.NET land, but I happen to be a great fan of them. In fact, I find them downright amazing!!!
The reasons for using an UpdatePanel (UP in this discussion) are that you eliminate the need to write client-side browser code, you get to use nice code behind, and in MOST cases it is a viable choice and solution.
What a UP does is allow you to update only parts of the browser page (just in case anyone is wondering: Razor Pages attempt the same goal, but do it in a different way).
You don't mention when you want this calculation to occur. For example, do you want to use the control's TextChanged event?
Or does the user change the value and then have to hit a button?
Anyway, we assume you already have working code behind that calls for and gets the information you want.
Since you spent the time, money and effort to build that code?
Then give a UP a try!!
Just enclose that date control and button inside a UP, and you get what is called a "partial page" postback. That means ONLY that part of the page will be sent to the server. This means no flicker on the page and no scroll position changes. In fact, you achieve NEAR ALL of what an ajax call would give you.
But keep in mind that while this looks, walks and quacks like an ajax call, the page life cycle DOES run. That certainly means Page_Load will run. So the page should be designed with the standard pattern:
if (!IsPostBack)
{
// the REAL first page load code goes here
// code to load drop downs, combo's, pick lists etc.
// code to load data and grids etc.
}
We design all pages with the above in mind, since there is no need to re-load data on each postback and round trip, right????
So, to use a UP:
Drag in a ScriptManager (put it right below the page's form tag).
Then take your date field/control (whatever) and the button you NOW use to run the code behind that gets the data.
The controls you need to operate on (update) need to be included inside the UpdatePanel (don't put more of the page into the UP than you need).
The pattern looks like this:
<asp:UpdatePanel ID="UpdatePanel1" runat="server">
    <ContentTemplate>
        <!-- put markup of button and controls here -->
    </ContentTemplate>
</asp:UpdatePanel>
So, put your button + date control inside the above and try running it.
You will not see a page postback, you are free to use code behind, and you will not see any page flicker. It really is like magic.
Behind the scenes, you get an ajax setup, but without all the hassle.
Do give the above a try; you will find the results amazing, beautiful and nearly effortless, since you get to write 100% server-side code.
The next choice?
Create a static method in the current web page, and then use an ajax call.
This approach is about the best choice, but it is more effort, and in that static method you have no use of the controls on the current web page (so the ajax call has to pass any values required from the page, and on return of that ajax call, the client-side JS code has to update the controls on the page).
So, best choice? Well, an ajax call is "somewhat" better, but then again it requires far more effort (and thus more cost).
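A sketch of what that ajax call could look like on the client side. The page name (`Default.aspx`) and method name (`GetCategory`) are hypothetical stand-ins for whatever static [WebMethod] you write:

```javascript
// Build the $.ajax options for calling an ASP.NET page method.
// ASP.NET page methods expect a JSON POST body and return the result
// wrapped in a "d" property.
function buildPageMethodRequest(page, method, args) {
  return {
    url: page + '/' + method,
    type: 'POST',
    contentType: 'application/json; charset=utf-8',
    dataType: 'json',
    data: JSON.stringify(args)
  };
}

// Usage in the browser (jQuery loaded, control IDs are hypothetical):
// $.ajax(buildPageMethodRequest('Default.aspx', 'GetCategory', { idRef: '123' }))
//   .done(function (result) {
//     $('#TextBox1').val(result.d); // update the control client-side
//   });
```

Note that, unlike the UP, all the control updating here is your job in client-side code.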
Try the UpdatePanel. You will be rather amazed, since it is oh so easy to remove a postback on a given WebForms page.
However, do keep in mind that the page life cycle STILL occurs; it is only the rendering that is now limited to the part you placed in that UP.
I'm working on a vue app that uses vuex and gets objects from an api. The tables have paging and fetch batches of objects from the api, sometimes including related entities as nested objects. The UI allows some editing via inputs in a table, and adds via modals.
When the user wants to save all changes, I have a problem: how do I know what to patch via the api?
Idea 1: capture every change on every input and mark the object being edited as dirty
Idea 2: make a deep copy of the data after the fetch, and do a deep comparison to find out what's dirty
Idea 3: this is my question: please tell me that idea 3 exists and it's better than 1 or 2!
If the answer isn't idea 3, I'm really hoping it's not idea 1. There are so many inputs to attach change handlers to, and if the user edits something, then re-edits back to its original value, I'll have marked something dirty that really isn't.
The deep copy / deep compare at least isolates the problem to two places in code, but my sense is that there must be a better way. If this is the answer (also hoping not), do I build the deep copy / deep compare myself, or is there a package for it?
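For what it's worth, idea 2 can be fairly small. A crude sketch using JSON round-tripping as the deep copy and deep compare (this assumes plain API data with stable key order, no functions, Dates, or undefined values):

```javascript
// Snapshot the rows right after the fetch.
function snapshot(rows) {
  return JSON.parse(JSON.stringify(rows));
}

// On save, return only the rows whose serialized form changed.
// `key` identifies rows across the two arrays (e.g. 'id').
function dirtyRows(original, current, key) {
  var byKey = {};
  original.forEach(function (row) { byKey[row[key]] = JSON.stringify(row); });
  return current.filter(function (row) {
    return byKey[row[key]] !== JSON.stringify(row);
  });
}

var fetched = [{ id: 1, name: 'a' }, { id: 2, name: 'b' }];
var copy = snapshot(fetched);
fetched[1].name = 'edited';
var dirty = dirtyRows(copy, fetched, 'id');
```

One nice property: an edit followed by a re-edit back to the original value compares as clean, which the change-handler approach of idea 1 gets wrong.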
It looks like you have the final state in the UI and want to persist it on the server. Instead of sending over the delta, I would just send the full final state and overwrite whatever was on the server side.
So if you have user settings, instead of sending which settings were toggled, just send "this is what the new set of settings is".
Heavy stuff needs to be done on the server rather than the client most of the time, so I'd follow the answer given by Asad. You're not supposed to compute huge object diffs; it's 2022, so we need to think about performance.
Of course, it also depends on your app and what it's all about. Maybe your API guy is opposed to this for a specific reason (not only related to performance). Set up a meeting with your team/PO and check what is feasible.
You can always do something on your side too; looping over all inputs should be feasible without wiring each one up manually.
TL;DR: this needs to be a discussion in your company with your very specific constraints/limitations. All the "reasonable solutions" are already listed, and you will probably not be able to go further, because these kinds of "opinion-based" questions are not allowed on SO anyway.
I am in the middle of the design/development of a web store and am thinking my way through the best way of handling a transparent load of a couple of megabytes of product items. It seems the Asynchronous bit of AJAX doesn't mean parallel so I have to be a little bit creative here.
Rather than just pull a large lump of data down I was thinking of breaking it into pages of say 50->100 items and allowing the browser some time to process any internal messages.
The loader would pull down a page of data, then fire a custom event to itself to get the next page. The theory is that if the browser has other messages to process, this event would queue up behind them, allowing the browser to do anything else it has to do. A loss of a bit of speed, but a smoother user experience.
Rinse and repeat.
Add in some smoke and mirrors engineering - a loading icon or some such - to keep the user from noticing any delays and I should be right.
Before I dive into what is starting to sound like a fun bit of code can anyone think of a better way to pull down a large lump of data in as smooth and friendly a way as possible? I am an ancient old programmer - but JavaScript is a bit new to me.
Am I reinventing the wheel - AJAX already does all this - and I just don't know about it?
There are two ways to improve the situation:
a) Reduce the data coming from the database; i.e. if there is some information which is not used, you don't need to load it. Also, if there is non-changeable data, you may cache it and request it only once at the beginning.
b) Load only the information you need to show. That's the way you are thinking about, except that you want to trigger loading of new data automatically, or at least that's what I understood. I'd suggest keeping the ajax requests to as few as possible and making a new one only when the user needs more data. For example, if the user stays on page 1 of 20, you don't need to fire loading of pages 3 and 4. It may be a good idea to preload page 2, though, so the user can switch quickly.
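If you do go the chunked route from the question, the queue-behind-other-messages behaviour can be had with `setTimeout(…, 0)` instead of a custom event. A sketch, where `loadPage` is a hypothetical stand-in for the real ajax call:

```javascript
// How many pages a dataset needs at a given page size.
function pageCount(totalItems, pageSize) {
  return Math.ceil(totalItems / pageSize);
}

// Load pages one at a time, yielding to the browser's message queue
// between chunks so the UI stays responsive.
function loadInChunks(totalItems, pageSize, loadPage, done) {
  var pages = pageCount(totalItems, pageSize);
  var next = 0;
  function step() {
    if (next >= pages) { return done(); }
    loadPage(next++);          // e.g. fire the ajax request for this page
    setTimeout(step, 0);       // let other queued messages run first
  }
  step();
}
```

The trade-off is exactly the one described above: slightly slower overall, but the browser never locks up on one big lump of data.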
Disclaimer: I am a designer with little jQuery knowledge. I will work with a JavaScript developer to realise this, but need to be able to write a detailed specification of my requirement.
I am looking to develop a platform which will track user behaviour via the jQuery events API. Events, such as whether a user points-and-clicks or tabs to the next form field, will be scored, e.g. point-and-click gets a -1 and tab gets a 2.
What is the best way of keeping track of all these event scores? My initial thought is an integer variable which increases and decreases depending upon user behaviour. The platform must also be able to:
persist the score in a cookie
use the value of the variable (if this is the best method) to decide which content to deliver to the viewstate via AJAX
I am happy to receive code snippets and/or written specification suggestions on how best to handle this.
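A rough sketch of the scoring idea, for the spec. The deltas (-1 for a click, +2 for a tab) come from the requirement above; the cookie name and selectors are hypothetical:

```javascript
// Score deltas per behaviour, as described in the requirement.
var SCORE_DELTAS = { click: -1, tab: 2 };

// A tracker closing over a single integer score.
function makeScoreTracker() {
  var score = 0;
  return {
    record: function (kind) {
      score += SCORE_DELTAS[kind] || 0;
      return score;
    },
    value: function () { return score; }
  };
}

// In the page, jQuery handlers would feed the tracker, e.g.:
// $('input').on('keydown', function (e) {
//   if (e.key === 'Tab') tracker.record('tab');
// }).on('mousedown', function () { tracker.record('click'); });
// document.cookie = 'behaviourScore=' + tracker.value();
var tracker = makeScoreTracker();
tracker.record('tab');
tracker.record('click');
```

The resulting integer is what would then drive the AJAX content decision.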
Well, if your need is to be able to write a detailed specification of your requirements, then copy/paste your question. It's fine as it is.
Or make it clearer by removing your own questioning: if you're writing required specs, don't try to answer how it should be done, just state what has to be done.
Track user behaviour via the jQuery events API. Events, such as whether a user 'points-and-clicks' or tabs to the next form field.
You can add extra 'project management' infos like
date required
accessibility scope
notes that are nice for a designer to write but useless for a developer
budget
data chart / data flow
inputs/outputs
I've been getting more and more into high-level application development with JavaScript/jQuery. I've been trying to learn more about the JavaScript language and dive into some of the more advanced features. I was just reading an article on memory leaks when i read this section of the article.
JavaScript is a garbage collected language, meaning that memory is allocated to objects upon their creation and reclaimed by the browser when there are no more references to them. While there is nothing wrong with JavaScript's garbage collection mechanism, it is at odds with the way some browsers handle the allocation and recovery of memory for DOM objects.
This got me thinking about some of my coding habits. For some time now I have been very focused on minimizing the number of requests I send to the server, which I feel is just good practice. But I'm wondering if sometimes I go too far. I am largely unaware of the efficiency issues and bottlenecks that come with the JavaScript language.
Example
I recently built an impound management application for a towing company. I used the jQuery UI dialog widget and populated a datagrid with specific ticket data. Now, this sounds very simple on the surface... but there is a LOT of data being passed around here.
(and now for the question... drumroll please...)
I'm wondering what the pros/cons are for each of the following options.
1) Make only one request for a given ticket and store it permanently in the DOM, simply showing/hiding the modal window. This means only one request is sent out per ticket.
2) Make a request every time a ticket is open and destroy it when it's closed.
My natural inclination was to store the tickets in the DOM, but I'm concerned that this will eventually start to hog a ton of memory if the application goes a long time without being reset (which it will be).
I'm really just looking for pros/cons for both of those two options (or something neat I haven't even heard of =P).
The solution here depends on the specifics of your problem, as the 'right' answer will vary based on length of time the page is left open, size of DOM elements, and request latency. Here are a few more things to consider:
Keep only the newest n items in the cache. This works well if you are only likely to redisplay items in a short period of time.
Store the data for each element instead of the DOM element, and reconstruct the DOM on each display.
Use HTML5 Storage to store the data instead of DOM or variable storage. This has the added advantage that data can be stored across page requests.
Any caching strategy will need to consider when to invalidate the cache and re-request updated data. Depending on your strategy, you will need to handle conflicts that result from multiple editors.
The best way is to get started using the simplest method, and add complexity to improve speed only where necessary.
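The "keep only the newest n items" option from the list above can be sketched as a tiny LRU cache built on `Map`, which preserves insertion order (a sketch, not a production cache — no expiry or invalidation):

```javascript
// A minimal least-recently-used cache holding at most maxItems entries.
function makeLruCache(maxItems) {
  var map = new Map();
  return {
    get: function (key) {
      if (!map.has(key)) return undefined;
      var value = map.get(key);
      map.delete(key);       // re-insert to mark as most recently used
      map.set(key, value);
      return value;
    },
    set: function (key, value) {
      if (map.has(key)) map.delete(key);
      map.set(key, value);
      if (map.size > maxItems) {
        map.delete(map.keys().next().value); // evict the oldest entry
      }
    },
    size: function () { return map.size; }
  };
}
```

Keyed by ticket ID and storing the ticket data (not DOM nodes), this bounds memory regardless of how long the page stays open.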
The third path would be to store the data associated with a ticket in JS, and create and destroy DOM nodes as the modal window is summoned/dismissed (jQuery templates might be a natural solution here.)
That said, the primary reason you avoid network traffic seems to be user experience (the network is slower than RAM, always). But that experience might not actually be degraded by making a request every time, if it's something the user intuits involves loading data.
I would say number 2 would be best. Because that way if the ticket changes after you open it, that change will appear the second time the ticket is opened.
One important factor is the number of redraws/reflows that are triggered by DOM manipulation. It's much more efficient to build up your content changes and insert them in one go than to do it incrementally, since each increment causes a redraw/reflow.
See: http://www.youtube.com/watch?v=AKZ2fj8155I to better understand this.
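The batch-then-insert pattern looks like this (a sketch; the element ID is hypothetical, and items are assumed to be pre-escaped strings):

```javascript
// Build all the markup in one pass, off-DOM.
function buildRowsHtml(items) {
  var parts = [];
  for (var i = 0; i < items.length; i++) {
    parts.push('<li>' + items[i] + '</li>');
  }
  return parts.join('');
}

// In the browser: one DOM write, hence one reflow...
// document.getElementById('list').innerHTML = buildRowsHtml(items);
// ...instead of appending each <li> individually, which reflows per row.
```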