My company uses Datylon (a third-party plug-in) to create graphs in Illustrator for reports that are replicated anywhere from 25 to 350 times. I have figured out a way to automate the majority of the process, but the one thing I cannot figure out is how to link the data automatically to the graph template file. I have found a way to link an Excel file to the template, but I cannot figure out how to call the button's action in JavaScript. Basically, this button lets you connect a new data sheet to the file, and it automatically updates the data within the graphs. Is there a way to find this interaction somehow, or any creative way to press this button? It's the last step in making this whole process completely automated. Any help would be much appreciated!
I have tried contacting the company to ask for a scriptable call for the button, but they are very reluctant to help because they upsell a service that stores your data on their servers and updates the graphs from there. However, because of the secure nature of the data we handle, we cannot store our data on their servers.
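One avenue worth exploring, purely as a sketch: Illustrator's ExtendScript can trigger menu items by their command string via app.executeMenuCommand, and some plug-ins register their panel actions as menu commands. Whether Datylon does is an assumption, and the command string below is a guess that would have to be discovered:

```javascript
// ExtendScript sketch for Illustrator. Whether Datylon registers a menu
// command, and what its string would be, are ASSUMPTIONS to verify.
var doc = app.open(new File('~/templates/report-template.ai')); // hypothetical path
try {
    app.executeMenuCommand('Datylon Connect Data'); // guessed command string
} catch (e) {
    alert('No such menu command: ' + e.message);
}
```

If no menu command exists, the remaining fallback is OS-level UI automation (AppleScript GUI scripting or AutoHotkey) to press the button itself.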
I'm a total stranger to JavaScript and very new to Python, so please pardon any ignorance!
I have an internal website that lets me download release reports. My idea is to create a userscript that downloads the release report as a CSV.
This is the flow:
User lands on a release report page
Makes a selection and clicks on a button
This shows the release data in a raw format (which is extremely crude and not user-friendly). A Download JSON button on the page lets you download the data in JSON format.
I observed that at step 2 an API call is made with a release ID, which is of course dynamic.
My plan is to:
Show a Download CSV button when the Download JSON button is available - I have completed this.
Intercept the GET request (whose URL is dynamic) and fetch the JSON.
Flatten this JSON and create a CSV with whatever I need - I already have this working in Python.
My questions are -
Is it possible to capture the network request the browser made? Since the URL is dynamic, I have to rely on the last call the browser made. I know the URL format, but a regex against the request URL will not work (I realised later how obviously silly that was). A sketch of one approach appears after this list.
My JSON normalising code is in Python. Since the JSON itself is very dynamic and complex, I used Pandas, but I have no idea whether I can even invoke it from JavaScript (a plain-JS alternative is sketched at the end of this post).
Is there a better way to do this using just Python? Maybe I shouldn't rely on a userscript - is there an alternative way to drive the browser for this specific task?
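On the first question, a sketch of one approach: a userscript can wrap window.fetch (or XMLHttpRequest) so it sees the page's own calls, including the dynamic URL. This assumes the page uses fetch, and '/release/' is a placeholder for whatever the real endpoint contains:

```javascript
// Userscript sketch: wrap window.fetch to observe the page's own requests.
// '/release/' is a PLACEHOLDER for the real endpoint fragment.
const originalFetch = window.fetch;
window.fetch = async function (...args) {
  const response = await originalFetch.apply(this, args);
  const url = typeof args[0] === 'string' ? args[0] : args[0].url;
  if (url.includes('/release/')) {
    // Clone the response so the page can still consume the body.
    response.clone().json().then(data => {
      console.log('Captured release JSON from', url, data);
      // hand the data to the CSV step here
    });
  }
  return response;
};
```

Depending on the userscript manager, you may need @run-at document-start (and possibly unsafeWindow) so the wrapper is installed before the page's scripts run.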
I'm looking for any direction or even a possible different approach.
Many thanks.
I can't seem to capture the requests made by the browser. I'm exploring chrome.webRequest for this, but I'm not sure I can integrate all these pieces together. I'm also wondering whether Selenium could ease any of these tasks.
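On the Pandas question: Python cannot run inside a userscript, but if the captured JSON is reasonably regular, a plain-JS flattener can build the CSV in the browser, replacing the pandas.json_normalize step. A minimal sketch (much simpler than what Pandas does, so treat it as a starting point):

```javascript
// Sketch: flatten nested objects into dotted keys, then emit CSV.
function flatten(obj, prefix = '', out = {}) {
  for (const [key, value] of Object.entries(obj)) {
    const name = prefix ? prefix + '.' + key : key;
    if (value && typeof value === 'object' && !Array.isArray(value)) {
      flatten(value, name, out); // recurse into nested objects
    } else {
      out[name] = Array.isArray(value) ? JSON.stringify(value) : value;
    }
  }
  return out;
}

function toCsv(records) {
  const rows = records.map(r => flatten(r));
  const headers = [...new Set(rows.flatMap(r => Object.keys(r)))];
  const esc = v => '"' + String(v ?? '').replace(/"/g, '""') + '"';
  return [headers.join(','),
          ...rows.map(r => headers.map(h => esc(r[h])).join(','))].join('\n');
}

// Offer the CSV as a download from the userscript.
function downloadCsv(records, filename) {
  const blob = new Blob([toCsv(records)], { type: 'text/csv' });
  const a = document.createElement('a');
  a.href = URL.createObjectURL(blob);
  a.download = filename;
  a.click();
  URL.revokeObjectURL(a.href);
}
```

If the JSON is too irregular for this, driving the browser with Selenium and handing the payload to the existing Python code is a reasonable alternative.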
The project I'm working on involves financial data; my issue specifically pertains to chart visualizations with d3.js. My back-end is PHP using Laravel, and my front-end requests data through an AJAX call to the back-end, which sends the data to the client in the response.
I then use d3.js to visualize the data in various graphs. Because the data arrives via async GET requests, I don't have a specific URL path to navigate to in order to 'save' a state. Navigating away from this page resets anything a user might have entered.
I am wondering if there is an easy way to save a chart's state. Maybe cookies are something to look into? The user can add various technical indicators such as moving averages, and there is technically no limit to how many indicators they can show. As a result, a user can concoct some complex charts, all of which would be reset any time the page is refreshed or in the off chance they accidentally hit the back button.
I already have user registration on the website, so I could also consider saving chart states on my end and fetching them when a user navigates to the charts page. I have access to both a MongoDB database and a MySQL database.
I've never done anything like this before; this is my first time working with a deployed app and the first time I've really needed to consider something like this.
Multiple charts also are depicted on this page, if that makes any difference.
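A client-only first pass that survives refreshes and the back button is to persist the chart configuration (not the rendered SVG) in localStorage; the same serialized object could later be POSTed to the Laravel backend and saved per user instead. A minimal sketch with hypothetical field names:

```javascript
// Sketch: persist the chart *configuration*, not the rendered SVG.
// Field names (symbol, indicators, ...) are hypothetical.
const STORAGE_KEY = 'chartState';

function saveChartState(state) {
  localStorage.setItem(STORAGE_KEY, JSON.stringify(state));
}

function loadChartState() {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? JSON.parse(raw) : null;
}

// Call saveChartState whenever the user adds or removes an indicator:
saveChartState({
  symbol: 'AAPL',
  range: '1y',
  indicators: [
    { type: 'sma', period: 50 },
    { type: 'sma', period: 200 }
  ]
});

// On page load, rebuild every chart from the saved configuration:
const state = loadChartState();
if (state) {
  // redrawCharts(state); // hypothetical entry point into the d3 code
}
```

Swapping localStorage for an AJAX call against the user's account turns the same approach into server-side persistence.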
I have two Bokeh apps (on Ubuntu / Supervisor / Nginx): one is a dashboard containing a Google map, and the other is an account search tool. I'd like to be able to click a point on the Google map (representing a customer) and have the account search tool open with info from that point.
My problem is that I don't know how to get the data from A to B in the current framework. My ideas at the moment:
Have an event handler for the click that both saves a cookie and opens the account page, then have some JS on that page that reads the cookie and loads the account (see the sketch after this list).
Throw my hands up, try to merge both apps together, and just find a way to pass the data in the back end.
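For idea 1, the handoff itself is plain browser JS: set a cookie in the map's click callback (e.g. inside a Bokeh CustomJS), open the search app, and read the cookie back there. A minimal sketch, assuming both apps are served from the same domain so they share cookies (all names hypothetical):

```javascript
// Map app side (e.g. in a CustomJS click callback): stash the account id
// in a short-lived cookie and open the search tool.
function sendToSearch(accountId) {
  document.cookie = 'account_id=' + encodeURIComponent(accountId) +
                    '; path=/; max-age=300';
  window.open('/account-search', '_blank');
}

// Account search side: read the cookie back on load.
function getCookie(name) {
  const m = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
  return m ? decodeURIComponent(m[1]) : null;
}
const accountId = getCookie('account_id');
if (accountId) {
  // loadAccount(accountId); // hypothetical hook in the search app
}
```

If the search app can parse a query string, passing the id directly in the opened URL ('/account-search?account=...') is an even simpler handoff.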
The cookies idea might work fine. There are a few other possibilities for sharing data:
a database (e.g. redis or something else, that can trigger async events that the app can respond to)
direct communication between the apps (e.g. with ZeroMQ or similar). The Dask dashboard uses this kind of communication between remote workers and a Bokeh server.
files and timestamp monitoring if there is a shared filesystem (not great, but sometimes workable in very simple cases)
Alternatively if you can run both apps on the same single server (even though they are separate apps) then you could probably communicate by updating some mutable object in a module that both apps import. But this would not work in a scale-out scenario with more than one Bokeh server running.
For any or all of these somewhat advanced usages, a working example would make a great contribution to the docs so that others can learn from it.
We're looking to make a little web app to manage our week-long Nerf war (humans vs. zombies, to be precise), and we're wondering how easy it would be to have Google Sheets be our only backend, with our frontend entirely in JavaScript/HTML/CSS.
Let's say there are two actions that can be done from this JavaScript:
Register, which adds a row to a certain sheet.
Report tag, which adds a row to another sheet.
Let's say we have 100 players, each of whom signs in with a Google account. Is there a way that, for either of the above actions, Sheets can know who performed it?
This way, if someone gets hold of the API key and spoofs their referer to make bad requests, we can know which Google account did it and ban them from the game.
For example, if I open up my sheet and say "see revision history", I want to not see one user for all the revisions, I want to see the user who triggered the action.
Is this a reasonable approach, and is it possible? Thanks!
(Note: I know these two actions can be done via Google Forms, which can associate the user's account, but imagine we have more complex actions that can't be achieved with just a Google Form.)
The short answer is no. You'll be using the Sheets API (not the Drive API) to update the sheet. As far as Google is concerned, the "user" is your application, regardless of which human was driving the application at the time. Your application knows who the human is, so it is responsible for logging any audit info your use case may require.
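That said, if the frontend calls a Google Apps Script web app instead of hitting the Sheets API directly, the script can record the account itself. A minimal sketch, assuming the web app is deployed to execute as the user accessing it (sheet names, the action field, and the spreadsheet ID are hypothetical):

```javascript
// Apps Script sketch. Deployed as a web app that executes as the
// accessing user, so Session.getActiveUser() reflects the signed-in player.
// Sheet names, the action field, and SPREADSHEET_ID are hypothetical.
function doPost(e) {
  var user = Session.getActiveUser().getEmail(); // who triggered the action
  var ss = SpreadsheetApp.openById('SPREADSHEET_ID');
  var data = JSON.parse(e.postData.contents);

  if (data.action === 'register') {
    ss.getSheetByName('Players').appendRow([new Date(), user, data.codename]);
  } else if (data.action === 'reportTag') {
    ss.getSheetByName('Tags').appendRow([new Date(), user, data.taggedPlayer]);
  }
  return ContentService.createTextOutput(JSON.stringify({ loggedAs: user }))
      .setMimeType(ContentService.MimeType.JSON);
}
```

This writes the account into a column rather than relying on revision history, which will still attribute edits to whoever the script runs as.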
I have a PHP application with a MySQL database.
I want to create a site builder. The idea is to use ajax for loading and storing the dynamic page content. The user will be able to modify the site (create and edit pages, navigation menus, etc.) while viewing the front-end. The changes will be presented in real time, and then committed once saved.
I'm not sure what would be the best method for manipulating and storing the dynamic page content.
Should I just change the DOM, and then save its current state somehow? Or, would it be better to use an object for storing the page's content and structure? Would it be better to store the pages in SQL, or file?
EDIT
So, here is what I decided on, if it helps anyone (and thanks to all who responded!):
I created jQuery functions that let the user manipulate the DOM simply by clicking an element on the page and then adding content to a new element (so, for example, I have text and image insert tools). I'm using a handler object to track changes, which are applied to the DOM once the user clicks the update button.
Once the user saves the page, I use ajax to save a portion of the DOM to a MySQL database.
Then I have a pagebuilder function that calls my custom theme's header, pulls the HTML from the database, and then calls the theme's footer.
So far this is working very well. The pagebuilder constructs each page using the URL's ?page=x reference. This still allows my core app and theme system to control each page's header and footer, while allowing an easy way to edit, save, and retrieve the content, all using ajax.
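For anyone after the same pattern, a minimal sketch of the save step; the endpoint name and the content wrapper's id are hypothetical:

```javascript
// Sketch: save the editable region via jQuery ajax.
// 'save_page.php' and '#page-content' are hypothetical names.
function savePage(pageId) {
  $.ajax({
    url: 'save_page.php',
    method: 'POST',
    data: {
      page: pageId,                    // matches the ?page=x reference
      html: $('#page-content').html()  // the edited portion of the DOM
    }
  }).done(function () {
    alert('Page saved');
  }).fail(function (xhr) {
    alert('Save failed: ' + xhr.status);
  });
}
```

The PHP side would validate and escape the submitted HTML before writing it to MySQL.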
Based on my experience with Magento and your dynamic requirements, I think it's better to save the dynamic content in the database, but store things like:
"Home"
"3 columns"
"Input - Text"
...
And when you read it back, you use the "directions" saved in the database to build your dynamic website.
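For example, the stored "directions" for a page might look something like this (a hypothetical structure, not an actual Magento schema):

```javascript
// Hypothetical 'directions' for one page, stored as JSON in the database.
const pageDirections = {
  title: "Home",
  layout: "3 columns",
  blocks: [
    { type: "input-text", label: "Name" },
    { type: "image", src: "/uploads/banner.png" }
  ]
};
// A renderer walks this structure to build the actual page.
```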
I would say that it is a matter of architecture here. Storing in a database will give you better performance retrieving and storing long data streams, and will probably give you better data organization along the way.
The question is: how can you store a web site in a database efficiently?
As a partner said before, there are many ways to skin the cat. How complex do you want to go? Are you going to store single pages? Images? Tables? Full websites? Will your users be able to store raw/other data too?
You see?
Hope it helps.