Needing a little help here:
Context: I can log in with email without problems; redirect and url_for work flawlessly.
When I log in with Google, though, the user is logged in but not redirected, so the page doesn't reload, the logout button doesn't appear, and so on.
relevant code:
The Flask view for Google's authorized redirect URI ("http://localhost:5000/oauth2callback") is below.
By the way: I know I shouldn't trust Google's ID sent from the client like this; I'm still testing.
@app.route('/oauth2callback/<id>/<nome>/<email>', methods=['POST'])
def oauth2callback(id, nome, email):
    # print(f'the ID is {id}, the name is {nome} and the email is {email}')
    try:
        # query once and reuse the result instead of running the filter twice
        usuario_google = User.query.filter_by(email=email).first()
        if usuario_google:
            print(usuario_google)
            login_user(usuario_google)
            print('user logged in')
            return redirect(url_for('home', next=request.url))
        else:
            sessao_google = User(username=email, email=email, nome=nome)
            senhas_google = Senha(senha='')
            db.session.add(sessao_google)
            db.session.add(senhas_google)
            db.session.commit()
            print('registered')
            return redirect(url_for('login', next=request.url))
    except Exception:
        # `raise redirect(...)` raises a TypeError; a redirect must be returned
        return redirect(url_for('login'))
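For what it's worth, a redirect returned to an XMLHttpRequest is followed by the browser internally: the success callback receives the target page's HTML, but the visible page never navigates, which matches the "logs in but nothing changes" symptom. One option is to return the target URL as JSON and navigate client-side. A minimal, framework-free sketch of building that payload (the helper name and the URLs are assumptions, not part of the original code):

```python
import json

def oauth_payload(user_exists, home_url="/home", login_url="/login"):
    """Build the JSON body the XHR success callback can use to navigate.

    Hypothetical helper: in the Flask view you would return this with
    mimetype='application/json' instead of redirect(), and the JavaScript
    would set window.location to the 'redirect' value.
    """
    target = home_url if user_exists else login_url
    return json.dumps({"redirect": target})
```

On the client, something like `window.location = JSON.parse(xhttps.responseText).redirect;` in an `onload` handler would then perform the actual navigation.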
I'll add the JavaScript just in case:
function onSignIn(googleUser) {
    var profile = googleUser.getBasicProfile();
    var xhttps = new XMLHttpRequest();
    var novaurl = 'http://localhost:5000/oauth2callback/' + profile.getId() + '/' + profile.getName() + '/' + profile.getEmail();
    console.log(novaurl);
    xhttps.open('POST', novaurl);
    xhttps.send();
}
Thank you for any ideas/help.
I'm attempting to use Scrapy and Splash to retrieve the staff names, job titles, and emails from a particular website's staff page: https://www.kennedaleisd.net/Page/3884. I'm running Splash in Docker, since the emails are hidden behind dynamic JavaScript.
The spider works on the first page of staff, but I can't get it to work on the 2nd or 3rd pages. I opened developer tools, copied the request that is sent when you click one of the pagination links, and attempted to replicate that request in the spider. The problem appears to be that the response for that request only returns a subset of the code for the entire page (just the staff for that page) instead of everything, including the accompanying JavaScript. So when that response is passed on to Splash, it doesn't have the script needed to create the dynamic content. I also noticed that the request had a cookie entry of RedirectTo, which points back to the parent page. I tried including that cookie in the requests, and passing cookies from the first request to the paginated pages, but neither seemed to work. I also attempted some Lua scripts in the Splash request, but that didn't get me what I wanted either. Below I've included the spider as I have it right now.
I'm not sure if there's some way to reuse the JavaScript in subsequent requests, or to use that redirect cookie in some way to get the rest of the needed code. Any help would be appreciated. I realize this pagination is probably not the proper way to loop through pages, but I figured I could work on that once I get the reading of the data figured out.
import scrapy
from scrapy_splash import SplashRequest

class TestSpider(scrapy.Spider):
    name = 'TestSpider'
    start_urls = ['https://www.kennedaleisd.net/Page/3884']

    def start_requests(self):
        for url in self.start_urls:
            yield scrapy.Request(url, self.parse, meta={
                'splash': {
                    'endpoint': 'render.html',
                    'args': {'wait': 0.5}
                }
            })

    def parse(self, response):
        for item in response.css('div.staff'):
            name = item.css('li.staffname::text').get()
            title = item.css('li.staffjob::attr(data-value)').get()
            email = item.css('li.staffemail a::attr(href)').get()
            staffURL = response.request.url
            yield {
                'name': name,
                'title': title,
                'email': email,
                'staffURL': staffURL
            }
        if response.css('a.ui-page-number-current-span::text').get() == '1':
            pagination_results = response.css(
                'li.ui-page-number a:not([class^="ui-page-number-current-span"])::text').getall()
            base_url = 'https://www.kennedaleisd.net//cms/UserControls/ModuleView/ModuleViewRendererWrapper.aspx?DomainID=2042&PageID=3884&ModuleInstanceID=6755&PageModuleInstanceID=7911&Tag=&PageNumber='
            # backend_url = '&RenderLoc=0&FromRenderLoc=0&IsMoreExpandedView=false&EnableQuirksMode=0&Filter=&ScreenWidth=922&ViewID=00000000-0000-0000-0000-000000000000&_=1584114139549'
            for i in pagination_results:
                next_page = base_url + str(i)  # + backend_url
                yield response.follow(next_page, callback=self.parse, meta={
                    'splash': {
                        'endpoint': 'render.html',
                        'args': {'wait': 3}
                    }
                })
Well, after a bit of tinkering I figured out how to handle this with the Lua script I had been toying with. I'd still much prefer a different method if there is something a bit more official than scripting.
import scrapy
from scrapy_splash import SplashRequest

script_frontend = """
function main(splash)
    splash:init_cookies(splash.args.cookies)
    assert(splash:go{
        splash.args.url,
        headers=splash.args.headers,
        http_method=splash.args.http_method,
        body=splash.args.body,
    })
    assert(splash:wait(3))
    assert(splash:select('#ui-paging-container > ul > li:nth-child("""
script_backend = """) > a'):mouse_click())
    assert(splash:wait(3))
    local entries = splash:history()
    local last_response = entries[#entries].response
    return {
        url = splash:url(),
        headers = last_response.headers,
        http_status = last_response.status,
        cookies = splash:get_cookies(),
        html = splash:html(),
    }
end
"""

class TestSpider(scrapy.Spider):
    name = 'TestSpider'
    start_urls = ['https://www.kennedaleisd.net/Page/3884']

    def start_requests(self):
        for url in self.start_urls:
            yield scrapy.Request(url, self.parse, meta={
                'splash': {
                    'endpoint': 'render.html',
                    'args': {'wait': 0.5}
                }
            })

    def parse(self, response):
        for item in response.css('div.staff'):
            name = item.css('li.staffname::text').get()
            title = item.css('li.staffjob::attr(data-value)').get()
            email = item.css('li.staffemail a::attr(href)').get()
            staffURL = response.request.url
            yield {
                'name': name,
                'title': title,
                'email': email,
                'staffURL': staffURL
            }
        if response.css('a.ui-page-number-current-span::text').get() == '1':
            pagination_results = response.css(
                'li.ui-page-number a:not([class^="ui-page-number-current-span"])::text').getall()
            for i in pagination_results:
                script = script_frontend + str(i) + script_backend
                yield SplashRequest(self.start_urls[0], self.parse,
                                    endpoint='execute',
                                    cache_args=['lua_source'],
                                    args={'lua_source': script},
                                    headers={'X-My-Header': 'value'},
                                    session_id='foo')
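As an aside on the string splicing above: since the only thing that varies between scripts is the page number, one option is to build just the CSS selector in Python with an `int()` guard, so nothing but a number can ever be spliced into the Lua source. A small sketch (the function name is hypothetical):

```python
def pagination_selector(page):
    """Build the nth-child selector for a pagination link.

    int() raises ValueError for non-numeric input, so arbitrary text
    scraped from the page cannot end up inside the Lua script.
    """
    return '#ui-paging-container > ul > li:nth-child(%d) > a' % int(page)
```

With Splash's `execute` endpoint you could also pass the number separately, e.g. `args={'lua_source': script, 'page': i}`, and read it in Lua as `splash.args.page`; that keeps a single script, which lets `cache_args=['lua_source']` actually cache it.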
I have a django project where a user has a profile and can upload a profile picture. The models.py is:
class Profile(models.Model):
    user = models.OneToOneField(User, on_delete=models.CASCADE)
    name = models.CharField(max_length=64, blank=True)
    profilePic = models.ImageField(blank=True, null=True, upload_to="profile/")
    phoneNumber = models.CharField(max_length=12, blank=True)
    streetAddress = models.CharField(max_length=64, blank=True)
On my site, the user can edit their profile, including the profile picture. For that I have a form whose initial values are the ones already stored. The forms.py is:
class EditProfile(forms.ModelForm):
    def __init__(self, profile, *args, **kwargs):
        self.profile = profile
        super(EditProfile, self).__init__(*args, **kwargs)
        self.fields['name'] = forms.CharField(label='Name:', initial=profile.name, required=False)
        self.fields['phoneNumber'] = forms.CharField(label="Phone Number:", initial=profile.phoneNumber, required=False)
        self.fields['streetAddress'] = forms.CharField(label='Street Address and/or Postal Code:', initial=profile.streetAddress, required=False)
        self.fields['profilePic'] = forms.ImageField(label='Profile Picture:', initial=profile.profilePic, required=False)

    class Meta:
        model = Profile
        fields = ("name", "phoneNumber", "streetAddress", "profilePic")
This part works great, and on my site I can see the stored values. The problem is when I try to edit them and submit the form.
My views.py is:
def settings(request):
    user = request.user
    if request.method == 'GET':
        userProfile = Profile.objects.get(user=user)
        f1 = UserProfile(user=request.user)
        f2 = EditProfile(profile=userProfile)
        return render(request, 'listings/settings.html', {'form': f1, 'form2': f2})
    elif request.method == 'POST':
        userProfile = Profile.objects.get(user=user)
        f1 = UserProfile(user=request.user)
        f2 = EditProfile(profile=userProfile)
        name = request.POST["name"]
        phoneNumber = request.POST["phoneNumber"]
        streetAddress = request.POST["streetAddress"]
        Profile.objects.filter(user=user).update(name=name, phoneNumber=phoneNumber, streetAddress=streetAddress)
        if "profilePic" in request.FILES:
            image1 = request.FILES["profilePic"]
            fs1 = FileSystemStorage()
            fs1.save(image1.name, image1)
            userProfile.profilePic = image1
            userProfile.save()
        else:
            userProfile.profilePic.delete()
        messages.success(request, 'Your profile has been updated!')
        return redirect("/settings")
Everything gets edited without issues except the image field. If I upload a new file, it works and the image is updated. However, if I make no change to the image (i.e. I want to keep the same one), request.FILES is empty, so the code goes into the else branch and deletes the existing profilePic.
My question: I can see the initial profile picture on my site, so forms.py is working, but why isn't the picture submitted along with the rest of the form?
Your question is a bit confusing, but it seems that you are trying to reinvent the wheel. If you want to have the "initial" data reinserted into the form, you should use the native instance parameter. You can use it as such:
profile = Profile.objects.get(user=user)
# This can be in your GET
profile_form = EditProfile(instance=profile)
# This can be in your POST
profile_form = EditProfile(request.POST, request.FILES, instance=profile)
profile_form.save()
I'm developing an app in Web2Py that is a small e-commerce site. I have a controller and page at localhost:8000/topranchos/produto, with products, where topranchos is the app.
On the produto page there is a list of products like this:
The image is in this link
When the button "Adicionar ao carrinho" ("Add to cart") is clicked, this JavaScript function is executed:
<script>
function adicionarCarrinho(prod, qtde) {
    quantidade = document.querySelector(qtde).value;
    console.log(quantidade);
    if (quantidade > 0) {
        $.get("{{=URL(f="adicionarCarrinho")}}", {produto: prod, qtde: quantidade})
            .done(function(data) {
                console.log(data);
                var atual = document.querySelector(".badge-carrinho").innerHTML;
                document.querySelector(".badge-carrinho").innerHTML =
                    parseInt(quantidade) + parseInt(atual);
                alert("Adicionado ao carrinho com sucesso");  // "Added to cart successfully"
            });
    }
    else alert("Selecione a quantidade de itens deste produto que você deseja");  // "Select the quantity you want"
}
</script>
It makes a request to the action default/adicionarCarrinho:
def adicionarCarrinho():
    if request.vars:
        session.carrinho.append(
            # {'produto': db(db.produto.id == request.vars['produto']).select(),
            {'produto': int(request.vars['produto']),
             'quantidade': int(request.vars['qtde'])}
        )
    print "----------"
    print session.carrinho
    return str("OK")
where session.carrinho holds a list that was declared in the db.py model:
#carrinho
session.carrinho = []
In the terminal, print session.carrinho shows the item received by the AJAX request, but when I add other items the list is empty again. When I open the carrinho page, which shows session.carrinho's contents, the variable is also empty.
How can I fix this? I tried using cookies following the Web2Py book, but I'm a newbie in Web2Py and haven't had success yet :/
Thank you!
The model file is executed on every request, so you are resetting session.carrinho back to an empty list on every request. Instead, in the model, this:
session.carrinho = []
should be something like:
session.carrinho = [] if session.carrinho is None else session.carrinho
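The same guard can also be written as a small helper, if you prefer to keep the logic in one place. A minimal sketch (the function name is an assumption, and any attribute-style object stands in for web2py's session):

```python
def ensure_cart(session):
    """Initialize session.carrinho only when it is missing.

    Model files run on every request, so an unconditional
    `session.carrinho = []` would wipe the cart every time.
    """
    if getattr(session, 'carrinho', None) is None:
        session.carrinho = []
    return session.carrinho
```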
I'm developing a web application where I'm stuck on a problem with one feature. You can check it out here: http://qlimp.com (username/password: dummy/dummy).
After logging in, please click the link "Go to cover settings". You will see a palette where you can upload images and enter some text.
When you upload an image, an AJAX request I wrote in jQuery uploads it to the server and shows a full-page background preview of that image.
jQuery:
$('#id_tmpbg').live('change', function() {
    $("#ajax-loader").show();
    $("#uploadform").ajaxForm({success: showResponse}).submit();
});

function showResponse(responseText, statusText, xhr, $form) {
    $.backstretch(responseText);
    $("#ajax-loader").hide();
}
So the problem here is, when I upload the image, it shows
ValueError at /cover/
The view cover.views.backgroundview didn't return an HttpResponse object.
Request Method: POST
Request URL: http://qlimp.com/cover/
I'm actually returning HttpResponse object in views.
Views.py:
@login_required
def backgroundview(request):
    if request.is_ajax():
        form = BackgroundModelForm(request.POST, request.FILES)
        if form.is_valid():
            try:
                g = BackgroundModel.objects.get(user=request.user)
            except BackgroundModel.DoesNotExist:
                data = form.save(commit=False)
                data.user = request.user
                data.save()
            else:
                if g.tmpbg != '' and g.tmpbg != g.background:
                    image_path = os.path.join(settings.MEDIA_ROOT, str(g.tmpbg))
                    try:
                        os.unlink(image_path)
                    except OSError:
                        pass
                data = BackgroundModelForm(request.POST, request.FILES, instance=g).save()
            return HttpResponse(data.tmpbg.url)
    else:
        form = BackgroundModelForm()
        return render_to_response("cover.html", {'form': form}, context_instance=RequestContext(request))
Models.py:
class BackgroundModel(models.Model):
    user = models.OneToOneField(User)
    background = models.ImageField(upload_to='backgrounds', null=True, blank=True)
    tmpbg = models.ImageField(upload_to='backgrounds', null=True, blank=True)

class BackgroundModelForm(ModelForm):
    class Meta:
        model = BackgroundModel
        exclude = ('user', 'background')
But these things work on my computer (the image is saved and the background preview is shown), just not on the production server. Why is that? I've uploaded the same code to the server.
Could anyone help me? Thanks!
You are not returning a response when the request is AJAX but the form is invalid: in that branch the view falls off the end and returns None, which is exactly what this ValueError complains about.
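A toy model of the view's control flow makes the gap concrete (plain strings standing in for Django response objects; all names here are hypothetical):

```python
def backgroundview_flow(is_ajax, form_valid):
    """Mirror the view's branches; every path must produce a response."""
    if is_ajax:
        if form_valid:
            return 'HttpResponse(url)'
        # the branch the original view is missing: on production the
        # form can fail validation, and then nothing is returned
        return 'HttpResponseBadRequest()'
    return 'render(cover.html)'
```

In the real view, returning something like `HttpResponseBadRequest(form.errors.as_json())` when validation fails would also reveal *why* the upload validates locally but not on the server.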
What I'm trying to do is have an upload button that uploads files to our storage system. I'm using Google App Engine with Python, plus HTML and JavaScript for the views.
For that, we have an HTML page and a .js file that asks the user whether they're sure they want to overwrite a file. To ask that overwrite question, I need to query the database to know if the file exists, and therefore whether the question should be asked at all.
The thing is, I don't know where to start. I have this confirm() text shown to the user and a GQL database, but I don't know how to ask the question. For example, I upload via a URL, but then I don't get a response for that, and I also don't want to pass a question (name of the file, ...) to the database via the URL.
Do you have any idea what path I should follow? Am I trying something impossible or nonsensical?
Thanks a lot!
I'll add some code. This is the HTML form where we ask the user to upload a file:
<form id="up_file" enctype="multipart/form-data" method="post">
    <input type="hidden" name="user_id" value="{{ current_user.id }}"/>
    <input type="hidden" name="group_id" value="{{ group.id }}"/>
    <p>File: <input type="file" name="filename" id="file_name"/></p>
    <p><input type="button" value="Upload" onClick="seguro_sobreescribir(filename,{{ current_user.id }},{{ group.id }})"/></p>
</form>
And this is the JavaScript that currently tries to send information to our App Engine application when someone clicks the upload button:
function Request(function_name, opt_argv) {
    if (!opt_argv)
        opt_argv = new Array();

    // Find if the last arg is a callback function; save it
    var callback = null;
    var len = opt_argv.length;
    if (len > 0 && typeof opt_argv[len-1] == 'function') {
        callback = opt_argv[len-1];
        opt_argv.length--;
    }
    var async = (callback != null);

    // Build an Array of parameters, w/ function_name being the first parameter
    var params = new Array(function_name);
    for (var i = 0; i < opt_argv.length; i++) {
        params.push(opt_argv[i]);
    }
    var body = JSON.stringify(params);

    // Create an XMLHttpRequest 'POST' request w/ an optional callback handler
    var req = new XMLHttpRequest();
    req.open('POST', 'https://safeshareapp.appspot.com/upload', async);
    req.setRequestHeader("Content-type", "application/x-www-form-urlencoded");
    req.setRequestHeader("Content-length", body.length);
    req.setRequestHeader("Connection", "close");

    if (async) {
        req.onreadystatechange = function() {
            if (req.readyState == 4 && req.status == 200) {
                var response = null;
                try {
                    response = JSON.parse(req.responseText);
                } catch (e) {
                    response = req.responseText;
                }
                callback(response);
            }
        }
    }

    // Make the actual request
    req.send(body);
}
// Check if the file exists and, if so, ask whether to overwrite it
function seguro_sobreescribir(filename, user_id, group_id) {
    // confirm message: "Are you sure you want to overwrite file X of user Y in group Z?"
    var resp = confirm("Seguro que quiere sobreescribir el archivo " + filename.value + " del usuario " + user_id + " del grupo " + group_id + "?");
    if (resp) {
        var result = Request('Upload', [filename, user_id, group_id]);
        alert("Hemos hecho request " + result);  // "We made the request"
    }
}
And this is the RequestHandler that should handle our request:
class RPCHandler(webapp.RequestHandler):
    """Allows the functions defined in the RPCMethods class to be RPCed."""
    def __init__(self):
        webapp.RequestHandler.__init__(self)
        self.methods = RPCMethods()

    def post(self):
        args = simplejson.loads(self.request.body)
        func, args = args[0], args[1:]
        if func[0] == '_':
            self.error(403)  # access denied
            return
        func = getattr(self.methods, func, None)
        if not func:
            self.error(404)  # method not found
            return
        result = func(*args)
        self.response.out.write(simplejson.dumps(result))

class RPCMethods:
    def Upload(self, *args):
        status = -1
        fileitem = args[0]
        userid = args[1]
        groupid = args[2]
        return status

def main():
    app = webapp.WSGIApplication([('/upload', RPCHandler)], debug=True)
    util.run_wsgi_app(app)

if __name__ == '__main__':
    main()
The problem is that the returned status comes back as undefined in the JavaScript.
We also don't know whether we are actually uploading the file at all, and if not, how to do it. That's because we have two pieces we don't know how to put together:
the normal HTML form with input type="file", method="post", and an input type="submit" button
our connection to Google App Engine via the RequestHandler
Do you have any ideas?
Here's a possibility:
Implement a servlet that can answer the question given a filename; it could return a '0' or '1' (or whatever you choose) as its HTTP response depending on whether the file exists. Then make an XmlHttpRequest POST to that servlet from your javascript with the filename as a POST parameter. Depending on the return of the XmlHttpRequest, show UI to the user asking to confirm.
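The core of that exists-check handler can be sketched in plain Python (the function name and the '0'/'1' protocol follow the suggestion above but are otherwise assumptions; in the real handler, `stored_filenames` would come from a GQL query rather than an in-memory collection):

```python
def exists_response(filename, stored_filenames):
    """Body of the '0'/'1' HTTP response the JavaScript confirm() logic checks.

    '1' means the file already exists and the overwrite question should
    be shown; '0' means the upload can proceed without asking.
    """
    return '1' if filename in stored_filenames else '0'
```

The handler's `post()` would read the filename from a POST parameter, call something like this, and `self.response.out.write()` the result.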
Are you using blobstore to store the files on GAE?
If so, each blobstore entity has a property filename. Before you submit the form to your blobstore handler, do a query for that filename using BlobInfo.gql(query_string, *args, **kwds): http://code.google.com/appengine/docs/python/blobstore/blobinfoclass.html#BlobInfo_gql
For the "servlet" you can just write the request handler that will output 0 or 1 in response to a filename that is submitted.
Additionally, if you only want one file per user/organization with that file name, you may want to store a separate list of uploaded files and corresponding users/organizations and query that instead.