How to feed read-only data from the ClickUp API to a public website - javascript

I'm not an experienced web developer, so I'm having some trouble connecting the dots with regard to the ClickUp API (in my case), though the question can probably be generalized as well.
I've done a bit of research but can't seem to find exactly what I'm looking for, which makes me think I'm missing something.
In my case, the goal is to make certain information on ClickUp publicly VIEWABLE (not writable) and to format the HTML/CSS ourselves. The page would be viewable by strangers without requiring them to authenticate with ClickUp in any way.
ClickUp provides public links and embed options:
My attempt (JS): postMessage(content, url) and document.getElementById(frameID)
Issue: These fail in browsers that enforce cross-origin protection (the same-origin policy). Forcing strangers to disable this security is not viable for us. I've also tried alloworigin, anyorigin, and similar proxy services with no luck.
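For concreteness, here is roughly what that embed attempt looks like (the iframe ID and share URL below are placeholders, not the real ones):

```js
// Rough sketch of the embed attempt. The iframe ID and the public share
// URL are placeholders for whatever ClickUp generates.
const frame = document.getElementById('clickupFrame');
// e.g. <iframe id="clickupFrame" src="https://share.clickup.example/xyz">

// postMessage itself is allowed, but the ClickUp page isn't listening for it:
frame.contentWindow.postMessage({ want: 'task data' }, '*');

// ...and reading the embedded document throws a cross-origin SecurityError,
// because the frame's content comes from ClickUp's origin, not from my site:
const html = frame.contentWindow.document.body.innerHTML;
```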
ClickUp API 2.0 for POST, GET, etc.
Attempt: I wanted to use this unofficial JS wrapper https://www.npmjs.com/package/clickup.js but haven't written anything out yet.
Issues: From my understanding, this requires the stranger to authenticate with ClickUp before being able to read information, which would require server-side code (see the sketch below).
Other issues: This read-only page is intended to be hosted on GitHub Pages, which (from my understanding) doesn't support server-side code, so the solution I'm looking for may not exist.
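For reference, if I'm reading the docs right, a direct call to the ClickUp 2.0 API looks roughly like this (the list ID is a placeholder); the personal token has to ride along in the Authorization header, which is exactly why it can't just sit in public client-side code:

```js
// Sketch of a direct ClickUp 2.0 API call (Node 18+ / serverless style).
// LIST_ID is a placeholder; the token must stay server-side.
const LIST_ID = '123456789';
const TOKEN = process.env.CLICKUP_TOKEN; // personal API token

async function getTasks() {
  const res = await fetch(`https://api.clickup.com/api/v2/list/${LIST_ID}/task`, {
    headers: { Authorization: TOKEN },
  });
  if (!res.ok) throw new Error(`ClickUp API error: ${res.status}`);
  const data = await res.json();
  return data.tasks; // read-only task data to render however we like
}
```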
The most straightforward way I can think of to solve this is the parse-the-embedded-iframe method, but the cross-origin problem kills it dead in the water. At this point I'm stuck and would appreciate any additional feedback.
To formalize my question:
What architecture/pieces would I need to connect a front-end web page with private ClickUp information in a read-only fashion that allows front-end HTML/CSS modifications without user authentication?
My web development knowledge doesn't really go beyond JavaScript and maybe a tiny bit of PHP, so if there are other tools available for this, please let me know and I'll look into them.
Thanks for your time.

Related

JSON in WP with GET/POST method (security problems)

So I'm working on a mobile application that will present a WordPress page.
I'm building it as a web page, then spreading it to all mobile platforms using PhoneGap (essentially taking a WebKit view with all its functionality and wrapping that small web page into applications).
The web part is written in jQuery/JavaScript/HTML/CSS.
I've run into some security problems. All data on the original WordPress site is secret, and only members can see it.
And I need to get the data in JSON format.
And there is my problem. :)
I can install (I did install) a JSON API plugin, and I can get every bit of data I want. But the problem is that anyone else can too, just by typing some "get" stuff into the URL (for example: mysite.com/json/get_post/?id=1).
I need to "secure" that data and return it only if the user asking for it is logged in.
What is the best solution?
I know there are a lot of security problems in this stuff, and maybe only some encryption would help. But I need a quick and easy solution that will at least make it harder than just typing a URL. :)
I found something about OAuth but didn't really understand how to use it. Any ideas? Any WordPress plugin? Anything?
Thank you. :)
What you are describing is an OWASP A4 - Insecure Direct Object Reference vulnerability. Encryption does not solve this problem and is by no means the right tool. The issue is access control: the user needs to log in to the WordPress install in order to get this type of information. WordPress has a session system built in, and you'll have to read up on its API documentation.
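On the client side, the app then just has to send its requests with the WordPress session cookies so that server-side check can reject anonymous callers. A rough sketch (the endpoint is the JSON API URL from the question; the actual is-the-user-logged-in check still has to happen on the server):

```js
// Client-side sketch: send the WordPress session cookies with the request
// so the server can verify the login before returning the JSON.
$.ajax({
  url: 'https://mysite.com/json/get_post/?id=1', // endpoint from the question
  dataType: 'json',
  xhrFields: { withCredentials: true },          // include WP login cookies
  success: function (post) {
    console.log(post);
  },
  error: function (xhr) {
    if (xhr.status === 401 || xhr.status === 403) {
      // Not logged in: redirect to the WordPress login page instead.
    }
  }
});
```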

Pure Javascript and HTML app & deployment via CDN ... good idea?

A big and general question, though NOT a discussion
A friend and I are discussing a web application that's being developed. Currently it uses PHP, but the PHP doesn't store anything and it's all OAuth based. The whole thing talks to an independent API. The PHP really just mirrors a lot of the JavaScript logic for browsers without JavaScript support.
If it were decided to enforce Javascript as a requirement (let's not go into that ... whole other issue)
Are there any technical, fundamental problems with serving the app as HTML+JavaScript hosted on a CDN? That is, 100% static JavaScript and HTML with no server-side logic, since the JavaScript is just as capable of making all the API calls as the PHP. Are any existing sites doing this?
We can't think of any show-stoppers, but it seems like a scary thought to make a "web" app 100% client-side ... so we're looking for more input.
(To clarify, the question is about deploying using ONLY JavaScript and HTML and abandoning server-side processing outside the JSON API or whatever.)
Thanks in advance!
One issue is with search engines.
Search engine crawlers index the raw HTML source code of a web page. If you use JavaScript to load data and generate new content, the crawlers won't execute it, so your content won't get indexed.
However, Google is offering a solution for this - read here: http://code.google.com/web/ajaxcrawling/
Other than this, I can't think of any other issue...
Amazon has been offering this service on S3 for a little while now: http://aws.typepad.com/aws/2011/02/host-your-static-website-on-amazon-s3.html. Essentially it allows you to specify a default index page and error pages. Otherwise you just upload your HTML to S3 and point the www CNAME on your domain to the Amazon S3 bucket or CloudFront CDN.
The only thing that is not possible this way is handling a user who types example.com instead of www.example.com; you need to ensure your DNS correctly forwards those requests to www. S3 also cannot serve a naked domain (http://example.com/).
Regarding how good an idea it is: it sounds good to us as well, and we are currently exploring the option. So far it appears to work fine. What we have done is set up beta.example.com to point to a CDN-hosted site (S3) and we're testing to see if it gives us everything we need. Performance is great, though!

Hiding unique API keys in webOS (Enyo/JavaScript)

How do I hide my private API keys in/for my webOS - Enyo based apps?
My development has basically come to a halt because of this issue.
Since webOS Enyo (as well as Mojo) apps are coded in JavaScript, any user can plug their device in and easily view my source code. So obviously I can't just stick my keys in there. Even if they were encrypted, my app would have to include the mechanism for decrypting them in order to use them. I'm looking to hide my private web service API keys (mainly OAuth keys for Twitter, Facebook, Google, etc.) and maybe my AWS private keys.
So far the answers I've found state that you can't secure anything like a private API key in JavaScript. But all of those discussions deal with web applications, which have easy alternatives to using JavaScript. webOS apps don't really have a pretty alternative to coding simple apps in JavaScript.
The only path I see is to create a proxy that all of my API calls would pass through. Is that the only feasible or ideal option? If it is, would Node.js do the trick for me here?
Any leads, resources, examples, tips, etc. would be greatly appreciated. I feel like the answer should be staring me in the face since so many apps connect to these services nowadays, but I have had no leads. Thanks.
No application of any kind that holds client-side private keys like this (other than one that keeps them entirely server-side) is safe from prying eyes. This is true of a compiled C++ app for Windows too. If the application is going to use the private keys directly, then they're in the code or available to the code, and prying eyes can find them. Yes, JavaScript might make the code a little more accessible, but this is not a problem that's new to webOS or JavaScript apps. If Enyo were a PC/Mac cross-platform tool, wouldn't you have the same issue with your Twitter keys?
Usually, what is done is that the keys are put into some sort of storage mechanism at install time. On a PC, that might be the registry or a config file. Does webOS have an install mechanism? It looks like it has HTML5-type storage - can you store the keys there at install time? They won't be hack-proof (nor would they be on any other platform), but they also won't be lying around in your JavaScript code.
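A rough sketch of that idea using HTML5 localStorage (the provisioning step and key names are hypothetical; as noted, this only keeps the keys out of the shipped source, it doesn't make them secure):

```js
// Store keys at install / first-run time instead of hard-coding them in .js files.
// provisionKeys() would be called once by an installer step or a first-run
// handshake with your own server (both hypothetical).
function provisionKeys(keys) {
  localStorage.setItem('apiKeys', JSON.stringify(keys));
}

function getApiKey(service) {
  var keys = JSON.parse(localStorage.getItem('apiKeys') || '{}');
  return keys[service]; // undefined if never provisioned
}

// e.g. provisionKeys({ twitter: 'xxxx' }); ... later: getApiKey('twitter');
```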
The other solution is to require your developers to get their own keys for public services like Twitter rather than everyone using yours. That keeps you from risking your whole platform when there's one bad customer.
If I've misunderstood your situation, feel free to clarify and help me understand better.
My feeling is that having a proxy is a great idea. The proxy gives you additional benefits of adding user authentication and other functionality without changing the client side.
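If it helps, a minimal proxy in modern Node is only a few lines. This is just a sketch: the route, upstream URL, and header are placeholders for whichever service you're wrapping, and the secret comes from an environment variable so it never reaches the device.

```js
// Minimal key-hiding proxy sketch (Node 18+, no dependencies).
const http = require('http');

const API_KEY = process.env.UPSTREAM_API_KEY; // secret stays on the server

http.createServer(async (req, res) => {
  if (req.method === 'GET' && req.url.startsWith('/api/timeline')) {
    // Forward the request upstream, attaching the secret server-side.
    const upstream = await fetch('https://api.example.com/timeline', {
      headers: { Authorization: 'Bearer ' + API_KEY },
    });
    res.writeHead(upstream.status, { 'Content-Type': 'application/json' });
    res.end(await upstream.text());
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(3000);
```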
Take a look at the Key Manager service, which you can use to store your keys without having to code them into your JS files.

How does disqus work?

Does anyone know how disqus works?
It manages comments on a blog, but the comments are all held on third-party site. Seems like a neat use of cross-site communication.
The general pattern used is JSONP.
It's actually implemented in a fairly sophisticated way (at least on the jQuery site) ... they defer loading of the disqus.js and thread.js files until the user scrolls to the comment section.
The thread.js file contains JSON content for the comments, which is rendered into the page after it loads.
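For anyone unfamiliar with the pattern, here's a generic JSONP sketch (the endpoint and callback name are made up for illustration, not Disqus's actual URLs):

```js
// Generic JSONP: load cross-origin data by injecting a <script> tag; the
// server wraps its JSON in a call to the named callback function.
function jsonp(url, callbackName, onData) {
  window[callbackName] = function (data) {
    delete window[callbackName];
    script.parentNode.removeChild(script);
    onData(data);
  };
  var script = document.createElement('script');
  script.src = url + (url.indexOf('?') === -1 ? '?' : '&') + 'callback=' + callbackName;
  document.head.appendChild(script);
}

// Usage: the server responds with e.g. renderComments({"comments": [...]})
jsonp('https://comments.example.com/thread/42', 'renderComments', function (data) {
  console.log(data.comments);
});
```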
You have three options when adding Disqus commenting to a site:
Use one of the many integrated solutions (WordPress, Blogger, Tumblr, etc. are supported)
Use the universal JavaScript code
Write your own code to communicate with the Disqus API
The main advantage of the integrated solutions is that they're easy to set up. In the case of WordPress, for example, it's as easy as activating a plug-in.
Having the ability to communicate with the API directly is very useful, and offers two advantages over the other options. First, it gives you as the developer complete control over the markup. Secondly, you're able to process comments server-side, which may be preferable.
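For reference, the universal JavaScript code (the second option above) looks roughly like this; grab the exact snippet for your forum from the Disqus admin, since this is from memory. EXAMPLE stands for your forum shortname, and the page needs a <div id="disqus_thread"></div> where the thread should render.

```js
// Approximate shape of the Disqus universal embed code (check your admin
// panel for the exact, current snippet). EXAMPLE = your forum shortname.
var disqus_config = function () {
  this.page.url = 'https://yoursite.example/post/42/'; // placeholder page URL
  this.page.identifier = 'post-42';                    // placeholder identifier
};

(function () {
  var d = document, s = d.createElement('script');
  s.src = 'https://EXAMPLE.disqus.com/embed.js';
  s.setAttribute('data-timestamp', +new Date());
  (d.head || d.body).appendChild(s);
})();
```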
It looks like it uses the easyXDM library, which picks the best available way for the current browser to communicate with the other site.
Quoting Anton Kovalyov's (former engineer at Disqus) answer to the same question on a different site that was really helpful to me:
Disqus is a third-party JavaScript application that runs in your browser and injects itself on publishers' websites. These publishers need to install a small snippet of JavaScript code that makes the first request to our servers and loads initial JavaScript loader. This loader then creates all necessary iframe elements, gets the data from our servers, renders templates and injects the result into some element on the page.
As you can probably guess there are quite a few different technologies supporting what seems like a simple operation. On the back-end you have to run and scale a gigantic web application that serves millions of requests (mostly read). We use Python, Django, PostgreSQL and Redis (for our realtime service).
On the front-end you have to minimize your payload, make sure your app is super fast and that it doesn't break in extremely hostile environments (you will be surprised how screwed up publisher websites can be). Cross-domain communication—ability to send messages from hosting website to your servers—can be tricky as well.
Unfortunately, it is impossible to explain how everything works in a comment on Quora, or even in an article. So if you're interested in the back-end side of Disqus just learn how to write, run and operate highly-scalable websites and you'll be golden. And if you're interested in the front-end side, Ben Vinegar and myself (both front-end engineers at Disqus) wrote a book on the topic called Third-party JavaScript (http://thirdpartyjs.com/).
I'm planning to read the book he mentioned, I guess it will be quite helpful.
Here's also a link to the official answer to this question on the Disqus site.
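The loader-and-iframe pattern Anton describes can be sketched generically like this (all names and URLs here are invented for illustration, not Disqus's actual code):

```js
// 1) The snippet a publisher pastes into their page: load the loader script.
(function () {
  var s = document.createElement('script');
  s.async = true;
  s.src = 'https://widgets.example.com/loader.js'; // hypothetical loader URL
  (document.head || document.body).appendChild(s);
})();

// 2) Conceptually, inside loader.js: mount the widget in an iframe so its
// DOM and CSS are isolated from whatever the host page is doing.
function mountWidget(containerId) {
  var frame = document.createElement('iframe');
  frame.src = 'https://widgets.example.com/thread?host=' +
              encodeURIComponent(location.href);
  frame.style.width = '100%';
  frame.style.border = '0';
  document.getElementById(containerId).appendChild(frame);
}
```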
Short answer? AJAX. You get your own URL, e.g. "site.com/?comments=ID", included via JavaScript... but for real-time updates like that you would need a polling server.
I think they keep the content on their site and your site only sends and receives the data to/from Disqus. Now I wonder what happens if you decide you want to bring your commenting in-house without losing all existing comments! How easily could you get to your data, I wonder? They claim the data belongs to you, but they have control over it, and there isn't much explanation on their site about this.
I often leave comments on the Disqus platform. Sometimes a comment seems to be removed once you refresh the page, and sometimes it's not. I think the ones that disappear are being held for moderation without it saying so.

Connecting to remote hosts in an HTML/Javascript web app

I've been thinking about developing a web application using HTML and JavaScript for a little while now, but I've hit a wall in my ponderings. I want to be able to connect (long-term, not briefly) to a remote host with this app, one which is unfortunately not the server that the page was requested from.
From what I've read, JavaScript can't support long-term connections, and furthermore it won't make requests to anywhere other than the domain the page was downloaded from. I considered hidden Java or Flash objects, but Flash seems to cost money, and Java requires a signed applet (and I don't know whether it's worth getting one signed).
The only solution I think could work is using my server as a proxy to the others (through an unsigned Java applet?), but I really don't want to do that if I can help it. Is that my only realistic option, or are there other solutions I haven't considered yet?
(I considered asking on one of the other SO-alike sites, but StackOverflow seemed most apt, since this is largely a programming and design issue.)
After carefully considering my own plans for the application, I've decided to go forward with the server-as-proxy approach. Having the client handle the connections sounded like a good idea at first, to save on server resources, but it would have made other implementation ideas unworkable. Sticking to a strict server-as-proxy model handily solves these and other issues I was pondering over, so I suppose that's that!
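For what it's worth, the client side of that server-as-proxy model stays simple: the page only ever talks to its own origin, and the server holds the long-lived connection to the remote host. A long-polling sketch (the /proxy/messages route is hypothetical):

```js
// Long-poll the same-origin proxy; the proxy maintains the real connection
// to the remote host, so the browser never hits cross-origin restrictions.
function poll() {
  fetch('/proxy/messages')                    // same-origin request
    .then(function (res) { return res.json(); })
    .then(function (messages) {
      messages.forEach(function (m) { console.log('remote host:', m); });
      poll();                                 // immediately wait for the next batch
    })
    .catch(function () {
      setTimeout(poll, 5000);                 // back off on errors, then retry
    });
}

poll();
```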
