This is the first time I am using REST for any web application.
For a normal GET or POST I simply call the API built with Django REST Framework.
But I cannot figure out how to deal with situations where something more needs to be done.
Suppose I have
a list of users in the database along with the products they have bought.
Now I have a web form where, if someone adds a user and clicks the submit button, I have to:
Get the list of items bought by that user in a 5-hour window
Update the rows in the database, setting buy_successful to false
Then get the list of orders for the items the user has bought and update those rows, setting order_successful to false
Currently, in my submit action I am doing something like this:
Call the API to add the user to the override manual entry table. This is a simple POST to that table.
After that succeeds, I call the API again to get the list of items this user has bought, using query parameters. Then I have the list.
Then I loop through the list and POST to the API to update each record in the database.
And so on.
I feel this is not right.
I have found that quite often there is more to do than just saving individual objects to the database.
What is the best way to do that? Do I need an API view for every function?
Try the third step of the DRF tutorial:
http://www.django-rest-framework.org/tutorial/3-class-based-views
It shows how to do a "PUT" request for updating data, along with some of the other DRF features.
Also, you can reference serializer.object, which is the instance of the Django model record that you are saving to the database. This question talks about adding extra attributes, etc., before saving to the database:
Editing django-rest-framework serializer object before save
You can also access the record post_save, and there are other hooks in the framework that you can use.
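As a rough sketch, you could also keep the whole multi-step flow on the server behind a single endpoint, so the frontend makes one request instead of chaining several. The model and field names below (ManualEntry, Purchase, Order, buy_successful, order_successful) are assumptions based on the question, not an exact match for your schema:

```python
# Minimal sketch of one DRF view that performs all the updates in a single request.
from datetime import timedelta

from django.db import transaction
from django.utils import timezone
from rest_framework import status
from rest_framework.response import Response
from rest_framework.views import APIView

# from myapp.models import ManualEntry, Purchase, Order  # hypothetical models

class ManualEntryView(APIView):
    def post(self, request):
        user_id = request.data["user_id"]
        with transaction.atomic():
            # 1. Record the manual entry (hypothetical model).
            ManualEntry.objects.create(user_id=user_id)

            # 2. Find purchases from the last 5 hours and mark them unsuccessful.
            window_start = timezone.now() - timedelta(hours=5)
            purchases = Purchase.objects.filter(
                user_id=user_id, created_at__gte=window_start
            )
            updated = purchases.update(buy_successful=False)

            # 3. Mark the related orders unsuccessful as well.
            Order.objects.filter(purchase__in=purchases).update(order_successful=False)

        return Response({"purchases_updated": updated}, status=status.HTTP_200_OK)
```

Wrapping the steps in transaction.atomic() means that if any update fails, none of the changes are committed, which is hard to guarantee when the frontend chains separate API calls.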
Many of the tutorials for survey / forms in React tend to cover front-end mechanics only. In my case, I pretty much have the front-end where I want it but have come to realize I know virtually nothing about back-end programming.
I thought it best to take baby steps, so maybe adding a line to a JSON/TSV file after the user clicks a button would be a reasonable goal. I'm imagining the user manipulates all the bells and whistles I have, and once he/she clicks "submit", a new row is added to a "master_data.tsv" file on the back end.
Just for illustration, the portion of the state I would like to save is:
state = {
  selectBoxes: [
    { id: 1, strategies: ['Strat1', 'Strat2', 'Strat3', 'Strat4', 'Strat5'] },
  ],
};
For context, this state gets passed down to drop-down menu components that have event listeners to record the user's choice. I have it set up so that the state is updated to reflect the user's desired choice. But I have not figured out how to dump the data on the back end once the choice is selected and the user clicks "submit."
Question
Assuming click-flow:
Toggle dropdown menu -> choose item -> click "submit" button
How would I add a new row to master_data.tsv after each "submit" event?
(You can ignore unique-user qualification and all the fancy stuff; maybe we can settle for each new row having an id, though.)
I would recommend taking a step back and first thinking about the actual flow and data persistence of your application.
I would recommend creating a backend server (in any language) that lets you post the data to it via an API endpoint (usually a REST API with a POST endpoint).
After you receive your data, you have to persist it: either in a database, in a session, or on disk.
The last step is to retrieve the data in the desired format (TSV).
Either create another endpoint to return the data, or return the entire file already on POST.
Here is an example of how the flow could look:
Front-End: Send data on submitting to the backend (POST /entries)
Backend: Receive data and store it (disk, database …)
Front-End: Receive data from backend (GET /entries)
Backend: Returns entries as tsv
This way you are rather flexible and decoupled. Later on, you could easily switch the format to JSON, XML, CSV, …
Your source of truth should always be your storage layer (database, file on disk).
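A minimal sketch of that flow, assuming a Python/Flask backend and a local master_data.tsv file (both of which are assumptions, since the answer leaves the language and storage open):

```python
# Sketch: POST /entries appends a row to the TSV, GET /entries returns the whole file.
# Flask, the file path and the field names are assumptions for illustration.
import csv
import os
from flask import Flask, Response, request

app = Flask(__name__)
TSV_PATH = "master_data.tsv"

@app.route("/entries", methods=["POST"])
def add_entry():
    data = request.get_json() or {}
    # Derive a simple incremental id from the number of existing rows.
    if os.path.exists(TSV_PATH):
        with open(TSV_PATH, newline="") as f:
            row_id = sum(1 for _ in f)
    else:
        row_id = 0
    with open(TSV_PATH, "a", newline="") as f:
        writer = csv.writer(f, delimiter="\t")
        writer.writerow([row_id, ",".join(data.get("strategies", []))])
    return {"id": row_id}, 201

@app.route("/entries", methods=["GET"])
def list_entries():
    if not os.path.exists(TSV_PATH):
        return Response("", mimetype="text/tab-separated-values")
    with open(TSV_PATH, newline="") as f:
        return Response(f.read(), mimetype="text/tab-separated-values")
```

The React form would POST its selected strategies to /entries on submit, and anything that needs the full file can GET /entries.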
Let's say I have a form that I use to edit a Customer. In addition to various input fields, it will also have multiple drop-down lists to set some fields (e.g. Country, Category, Status...). Every drop-down list needs a separate list that I have to get from the backend to populate it.
That means that if I want to edit a customer with my form, I need to load:
The Customer Object which will be edited
A list of countries
A list of categories
A list of different status types
...
My question is:
Should each of these things be loaded separately with its own backend API call, or should I write a backend API call that combines all of them into a single object and use that to load my data?
I think it's better to use multiple API calls in almost every situation.
After comparing the pros and cons in the table below, I always choose multiple API calls for my projects.
Credit to Andrew Corrigan and Amrit for reminding me of some of the criteria.
|  | Single API | Multiple APIs |
| --- | --- | --- |
| Network | Fewer requests | More requests, but responses can be cached |
| UI render | All data renders at nearly the same time | Each part renders as soon as its API responds |
| Reuse of FE components | Need to call one big API to get a single array of data | Each component gets only what it needs |
| API reusability | Low | High |
| Single responsibility | No | Yes |
| Flexibility | No | Yes |
It's opinion-based and scenario-based, but my suggestion would be to load things separately, each with its own backend API call, because:
1. A single API call would be heavy, and the UX would be badly impacted.
2. The user may not change every field when the form opens, so only the fields that actually change will use the API.
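For illustration only (assuming a Python/Flask backend, which neither the question nor the answers specify), loading each lookup list from its own small endpoint could look like this:

```python
# Sketch: separate lookup endpoints the customer edit form can call independently.
# Flask and the hard-coded lists are assumptions for illustration.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/countries")
def countries():
    return jsonify(["Austria", "Germany", "Switzerland"])

@app.route("/api/categories")
def categories():
    return jsonify(["Retail", "Wholesale"])

@app.route("/api/statuses")
def statuses():
    return jsonify(["Active", "Inactive"])

# The customer being edited stays its own call (e.g. GET /api/customers/<id>),
# so each lookup list can be cached and reused by other forms.
```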
I've developed a web application based on the concept of a single-page application, but without any of the modern techs and frameworks.
So I have a jQuery page that dynamically requests data from localhost, a Laravel instance that compiles the entries in the DB (within a given time interval).
When the client wants to see all the entries for last week, the app works fine. But if he wants to see the results for the whole last month... there are so many that PHP's default execution time (30 seconds) isn't enough to process all the data. I can easily override this, of course, but then the jQuery client will loop through these arrays of objects and do stuff with them (sort, find, sum...). So I'm not even sure jQuery can handle that much data.
So my question can be broken in two:
Can Laravel's ->paginate() be used so that jQuery's AJAX requests can also chunk the data? How does this work (hopefully in a manner that doesn't force me to rewrite all the code)?
How could I store large amounts of information on the client? It's only temporary, but users will hang around on my webpage for a considerable amount of time, and I don't want them to wait 5 minutes every time they press a button.
Thanks.
If you want to provide an interface to a large amount of data stored in a backend, you should paginate the data. This is a standard approach, so I'm sure your client will be OK with that.
Using pagination is pretty simple - see the docs for Laravel 5.0 here: http://laravel.com/docs/5.0/pagination
In order to paginate results in the backend, you need to call paginate($perPage) on your query instead of get() in your controller, like this:
$users = User::whereIsActive(true)->paginate(15);
This will return a paginated result with 15 records per page. The page number is taken from the page parameter of the request. To get the third page of users, your frontend jQuery app needs to send a request to a URL like:
/users?page=3
I don't recommend caching the data in the frontend application. It can be changed by another user and you won't even know about it. And with pagination, your requests should be lightweight enough that you can stop worrying about the request sent to fetch each page of results.
Not sure if you're subscribed to Laracasts, but Jeffrey Way is amazing at explaining features of Laravel, and I highly recommend his videos.
In short, you can paginate the results, then in the view, when you foreach over your items, you can array_chunk() the results to display them how you need to. But the paginated results are fetched using a query string in the URL, and I'm not sure that's what you want if you're already using a lot of jQuery to keep everything on the same page.
https://laracasts.com/lessons/crazy-simple-pagination
But assuming you're already paginating the results with whatever jQuery you've written for the JSON data...
You could also use query scopes to fetch the data for the time range you need, creating a simple API to use with AJAX. I think that's probably what you're looking for.
So here's what I would do, assuming you're already doing some pagination manually with your JavaScript:
Create a few query scopes to filter the data for different lengths of time
Create simple routes to fetch results from a URI using the query scopes
Get the JSON data from those routes by performing AJAX requests to the URIs created
More information on Query Scopes: http://laravel.com/docs/5.1/eloquent#query-scopes
I currently have a Spring MVC app that gets a list of users from a database and displays their information in a table, using JSP to loop through each object in the list and create a table row for it.
Each user has an expiry date attribute as part of their record in the database. What I want to achieve is basically a button that, when toggled, shows or hides all users that have expired (i.e. their expiry date is less than today's date).
For this I am trying to use AJAX calls to my controller to fetch either all users, expired or not, or only the users that haven't expired, depending on how the button is toggled.
What I would like help on is the best way to achieve this, as I can think of a few nasty ways of doing it, like having a separate page and refreshing, but I am confused about a few things.
Should I just ditch the JSP looping that builds the table and write a JavaScript method that creates the table when given the data? If so, how do I get the data from the controller to JavaScript? Can an AJAX call to a controller return a list of my user objects?
My best guess is that instead of adding a list of objects to the model and letting JSP do the work, I should return JSON with the data and use JavaScript to build the table. I can then call an update method to rebuild the table.
You are correct. You have two options:
Have the AJAX call return HTML (i.e. a JSP fragment) for the table and then replace the body of the table
Use JavaScript to build the table and then update it with an AJAX call that returns JSON
If you want to get more sophisticated, you could use a JavaScript framework like Knockout.js, which would let you mark up the table and refresh it without writing too much JavaScript.
Blurgh, I'm not sure why this question has received so much attention, especially now in the days of Angular, but if you are struggling with this then I would strongly recommend the following library:
https://www.ag-grid.com/
What I do for the CRUD in my app is select all the items from the backend, load them into the front end, and loop over them using JS; to be specific, I use AJAX.
Think of my app as a todo list. Even if a user inserts a new item, I suppose I still need to select all the items from the DB again after the insert query, right? The same goes for delete: I could use remove(), but I still need to reload so that my item IDs don't get messed up. Correct?
I am using AngularJS ng-repeat, so I can't just do id++; instead I bind the id in ng-repeat to the object that I got from the JSON from the DB.
If I have a thousand items, that will cause a problem because I trigger the load function in the backend too often. How do I solve that?
Loading all the items from the back end is an invitation for disaster. It will kill both the back end and the front end. It becomes a serious usability problem if you dump thousands of rows of data in the UI: how will the user wade through the data and act on it? Provide some way to filter the items. For example, if it is a todo list, display one day at a time (the default being today); for any other use case you can provide a similar filtering mechanism. That way you query a limited amount of data from the back end, take it to the UI, and display it. If you cannot filter like this, at least provide some form of pagination to limit the data you query and transport to the UI.
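As a rough sketch (assuming a Python/Flask backend, which the question does not specify), a filtered and paginated endpoint might look like this, so the AngularJS client only ever asks for the slice it needs instead of reloading everything:

```python
# Sketch of a filtered + paginated todo endpoint.
# Flask and the in-memory TODOS list are stand-ins for illustration.
from datetime import date
from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-in for a database table of todo items.
TODOS = [{"id": i, "title": f"task {i}", "due": str(date.today())} for i in range(1000)]

@app.route("/todos")
def list_todos():
    # Filter by day (default: today) so the client never receives the full table.
    due = request.args.get("due", str(date.today()))
    page = int(request.args.get("page", 1))
    per_page = int(request.args.get("per_page", 20))

    matching = [t for t in TODOS if t["due"] == due]
    start = (page - 1) * per_page
    return jsonify({
        "total": len(matching),
        "page": page,
        "items": matching[start:start + per_page],
    })
```

On insert or delete, the client then only re-fetches the current page (or day) rather than the whole table.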