Twitter's Bootstrap typeahead setup - javascript

I'm using the official examples from Twitter.
The main problem is that I probably don't know how to use the Hogan engine. The JS side:
$("#search_name").typeahead({
name: 'name',
remote: {
url: '/entities/search_autocomplete.json?query=%QUERY',
template: '<p><strong>{{id}}</strong> – {{name}}</p>',
engine: Hogan
}
});
The server is returning the data in JSON, the structure is:
[{\"id\":1234,\"name\":\"Blah blah...\",\"tokens\":[\"blah...\",\"blah\"]}]

I just took this code from one of our projects; it should help you see the markup needed to convert an external JSON array into a custom autocomplete prompt:
$('input').typeahead({
  header: 'Your Events',
  template: [
    '<img class="ta-thumb" src="https://graph.facebook.com/{{id}}/picture?type=square" />',
    '<p class="ta-h1">{{name}}</p>',
    '<p class="ta-p">{{start_time}}</p>'
  ].join(''),
  limit: 3,
  remote: {
    url: 'https://graph.facebook.com/me/events?access_token=' + access_token,
    filter: function(parsedResponse) {
      // Convert each item of the Graph API response into a typeahead datum
      var dataset = [];
      for (var i = 0; i < parsedResponse.data.length; i++) {
        dataset.push({
          name: parsedResponse.data[i].name,
          start_time: parsedResponse.data[i].start_time,
          id: parsedResponse.data[i].id,
          value: parsedResponse.data[i].name,
          tokens: [parsedResponse.data[i].id, parsedResponse.data[i].name]
        });
      }
      return dataset;
    }
  },
  engine: Hogan
});
You need to download the Hogan.js template compiler and include it in your page (e.g. as an external script or via a module loader like Require.js). That is what defines the Hogan variable.
I'd also recommend looking at that Graph API call to understand the array conversion better.
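For example, you could load it with plain script tags before your typeahead setup runs (a rough sketch; the file names and paths below are placeholders, not the actual ones from this project):

<!-- load Hogan first so the global Hogan variable exists when typeahead is configured -->
<script src="/js/hogan.min.js"></script>
<script src="/js/bootstrap-typeahead.js"></script>
<script src="/js/search-setup.js"></script> <!-- contains the $("#search_name").typeahead(...) call above -->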
Hopefully this helps :)

Related

Web Scraping with Javascript?

I'm having a hard time figuring out how to scrape this webpage to get this wedding list into my one-pager. It doesn't seem complicated at first, but once I get into the code I just can't get any results.
I've tried ygrab.js, which was fairly simple and got me somewhere, but I can't seem to scrape the images, and it only prints the output to the console (there's not much documentation to go on).
$(function() {
  var $listResult = $('#list-result');
  var kado = [];
  var data = [
    {
      url: 'https://www.kadolog.com/fr/list/liste-de-mariage-laura-julien',
      selector: '.kado-not-full',
      loop: true,
      result: [
        {
          name: 'photo',
          find: '.views-field-field-photo',
          grab: {
            by: 'attr',
            value: 'src'
          }
        },
        {
          name: 'title',
          find: '.views-field-title .field-content',
          grab: {
            by: 'text',
            value: ''
          }
        },
        {
          name: 'description',
          find: '.views-field-body .field-content',
          grab: {
            by: 'text',
            value: ''
          }
        },
        {
          name: 'price',
          find: '.price',
          grab: {
            by: 'text',
            value: ''
          }
        },
        {
          name: 'remaining',
          find: '.topinfo',
          grab: {
            by: 'text',
            value: ''
          }
        },
        {
          name: 'link',
          find: '.views-field-nothing .field-content .btn',
          grab: {
            by: 'attr',
            value: 'href'
          }
        }
      ]
    }
  ];
  ygrab(data, function(result) {
    console.log(JSON.stringify(result, null, 2)); // photos = undefined
  });
});
Then there's Node.js with Request and Cheerio (and I tried Crawler too), but I have no idea how node works.
var request = require("request");
This gives me an error in the console saying require is not defined. Fair enough, I added require.js to the scripts in my page. I got another error ("Uncaught Error: Mismatched anonymous define() module: ...").
My question is this: Is there a simple JavaScript way (possibly without involving Node?) to scrape the wedding list I'm trying to get? Or maybe a tutorial that resembles what I'm trying to do, step by step?
I'd be truly grateful for any help or advice.
I think your only issue is the img selector.
Change
{
  name: 'photo',
  find: '.views-field-field-photo',
  grab: {
    by: 'attr',
    value: 'src'
  }
},
To this
{
  name: 'photo',
  find: '.views-field-field-photo .field-content img',
  grab: {
    by: 'attr',
    value: 'src'
  }
},
I actually can't test this right now, but it should be working!!
Node.js is a separate application that executes JavaScript independently of a web page.
require is Node's way of importing packages and isn't defined by the browser. require.js is a JavaScript library for loading modules, but it doesn't work the same way as Node's require function.
To use request and cheerio, you'd need to install Node.js from here, then install request and cheerio with the following commands:
npm install request --save
npm install cheerio --save
Then any code you write with Node.js in that directory will have access to the modules.
Here's a tutorial on web scraping in Node.js with cheerio.
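For reference, a minimal sketch of what that could look like with request and cheerio (this is not from the tutorial; the selectors are taken from the ygrab config above and are assumed to still match the page):

// scrape.js - run with `node scrape.js` after the npm installs above
var request = require('request');
var cheerio = require('cheerio');

request('https://www.kadolog.com/fr/list/liste-de-mariage-laura-julien', function (error, response, body) {
  if (error) return console.error(error);
  var $ = cheerio.load(body);
  var items = [];
  // One entry per gift item on the list page
  $('.kado-not-full').each(function () {
    var $item = $(this);
    items.push({
      photo: $item.find('.views-field-field-photo .field-content img').attr('src'),
      title: $item.find('.views-field-title .field-content').text().trim(),
      price: $item.find('.price').text().trim(),
      link: $item.find('.views-field-nothing .field-content .btn').attr('href')
    });
  });
  console.log(JSON.stringify(items, null, 2));
});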

Upgrading ExtJS in old ASP.NET application from 2.3 to 6

Looking for some assistance. TLDR version: we have an ASP.NET web app that leverages ExtJS 2.3 and we are looking to upgrade to the current ExtJS version. Trying to get my head around what we’re in for.
Now for the details. I will preface by saying that I am not an expert in ExtJS nor .NET development. In fact, I’m a novice pretty much across the board when it comes to web development, so please excuse any poor explanations or misuse of terms on my part. My team is developing a web app on a “custom” framework that was developed a number of years ago at our company. It’s based on some re-runnable code generation tools that take xml templates and spit out the necessary code files. Our project is an ASP.NET MVP application that uses .aspx pages and NHibernate for ORM. Our UI is created from ExtJS—the controls are defined in each page’s .js file and then “assembled” in the .aspx page. The codebehind contains web methods that leverage the presenter of the C# code. I’ve included a snippet to demonstrate what I’m talking about below.
.aspx page:
<%@ Page Language="C#" AutoEventWireup="true" CodeBehind="Entity.aspx.cs" Inherits="View.Example.EntityView" MasterPageFile="~/MasterPages/Content.Master" %>
<asp:Content ID="Content1" runat="server">
  <script language="javascript" type="text/javascript" src="~/Scripts/ext-2.2.1/ext-all.js"></script>
  <script language="javascript" type="text/javascript" src="<%=ResolveUrl("~/Scripts/Factory/Example/Entity.js")%>"></script>
  <script language="javascript" type="text/javascript">
    var localConfig = new panelConfig();
    localConfig.applyExtendedConfig('default_page');
    localConfig.addItem(new Ext.grid.GridPanel(pageConfigs.default_page_ManageEntity));
    localConfig.addItem(
      new Ext.form.Hidden({
        id: 'ManageEntityGrid_Rows'
      }));
    var default_page = localConfig.createExt();
    default_page.on('render', default_page_OnShow, default_page, { single: true });
  </script>
</asp:Content>
.js file:
var get_manageEntity_columns = function() {
  var columns = [
    {
      header: 'Name',
      id: 'ManageEntity-col-Name',
      dataIndex: 'Name',
      sortable: true
    },
    {
      id: 'ManageEntity-col-ActiveFlag',
      header: 'Active Flag',
      dataIndex: 'ActiveFlag',
      hidden: true,
      tags: [],
      sortable: true
    },
    {
      id: 'ManageEntity-col-CreatedTimestamp',
      header: 'Created Timestamp',
      dataIndex: 'CreatedTimestamp',
      hidden: true,
      tags: [],
      renderer: formattedDateTime,
      sortable: true
    },
    {
      id: 'ManageEntity-col-Id',
      header: 'Entity ID',
      dataIndex: 'Id',
      hidden: true,
      tags: [],
      sortable: true
    }
  ];
  return columns;
};
var get_grid_reader_manageEntity = function(custom_fields) {
  var fields = [
    { name: 'ActiveFlag', mapping: 'ActiveFlag' },
    { name: 'CreatedTimestamp', mapping: 'CreatedTimestamp' },
    { name: 'Id', mapping: 'Id' },
    { name: 'Name', mapping: 'Name' }
  ];
  if (custom_fields) {
    fields = fields.concat(custom_fields);
  }
  return new Ext.data.JsonReader({
    root: 'Results',
    totalProperty: 'Total',
    id: 'Id'
  }, fields);
};
var get_grid_datastore_manageEntity = function() {
  var store = new Ext.data.Store({
    proxy: new Ext.data.PageMethodProxy({
      pageMethod: 'GetManageEntity'
    }),
    reader: get_grid_reader_manageEntity(),
    remoteSort: true
  });
  store.loadOrReload = function() {
    if (store.rapidLoaded) {
      store.reload();
    } else {
      store.rapidLoaded = true;
      store.load({ params: { start: 0, limit: gPageSize } });
    }
  };
  get_grid_datastore_manageEntity = function() { return store; };
  return store;
};
var pageConfigs = {
  default_page_ManageEntity: {
    store: get_grid_datastore_manageEntity(),
    columns: get_manageEntity_columns(),
    viewConfig: {
      forceFit: true
    },
    sm: get_manageEntity_sm(),
    layout: 'fit',
    frame: true,
    id: 'ManageEntity',
    plugins: [
      grid_filters_manageEntity
    ],
    iconCls: 'icon-grid',
    loadMask: true,
    stripeRows: true,
    bbar: get_grid_paging_toolbar_manageEntity(),
    listeners: {
      rowcontextmenu: show_grid_menu_manageEntity,
      bodyscroll: function() {
        var menu = get_grid_menu_manageEntity();
        if (menu.isVisible()) menu.hide();
      },
      headerClick: function() {
        this.getStore().on('beforeload', this.saveState, this, { single: true });
      },
      render: function() {
        var grid = this;
        Ext.onReady(function() {
          add_applied_filters(grid);
          var grid_state = Ext.state.Manager.get('ManageEntity') || {};
          if (!grid_state.default_filter_applied) {
            var filters = grid_filters_manageEntity;
            var activeflag_filter = filters.getFilter("ActiveFlag");
            activeflag_filter.setValue(["", new Array("1")]);
            activeflag_filter.setActive(true);
            grid.on('beforestatesave', function(grid, state) { state.default_filter_applied = true; });
          }
          grid.getStore().load({ params: { start: 0, limit: gPageSize } });
        });
      }
    }
  }
};
.aspx.cs file:
[WebMethod()]
public static ExtJSGridData GetManageEntity(PageProxyArgs args)
{
    var watch = new Stopwatch();
    watch.Start();
    try
    {
        var data = new ExtJSGridData();
        var criteria = GetManageEntityQuery(args);
        criteria.SetFirstResult(args.Start).SetMaxResults(args.Limit);
        data.Results = GetDataManageEntity(args.RecordId, criteria);
        criteria.SetFirstResult(0).SetMaxResults(RowSelection.NoValue);
        criteria.ClearOrders();
        data.Total = criteria.SetProjection(Projections.CountDistinct("Id")).UniqueResult<int>();
        data.UserUiStateSaved = UserUiStateHelper.SaveUserUiStateInTransaction(args.UserUiState);
        watch.Stop();
        PageLogHelper.CurrentLog.ServerTime = watch.ElapsedMilliseconds;
        return data;
    }
    catch (Exception ex)
    {
        LogManager.GetLogger(MethodBase.GetCurrentMethod().DeclaringType).Error(ex);
        ErrorHandler.LogError(ex);
        throw;
    }
}

private static IList GetDataManageEntity(int id, ICriteria criteria)
{
    var list = criteria.List<Model.BusinessObjects.Entity>();
    var jsonList = Model.BusinessObjects.Entity.ToJSON(list);
    return jsonList;
}

private static ICriteria GetManageEntityQuery(PageProxyArgs args)
{
    ICriteria criteria = StaticPresenter.GetEntity();
    var helper = new GridFilterHelper(criteria, args, _dManageEntityLookupSortInfo);
    helper.ApplyFilterMap(EntityJSON.GetGridFilterMap(criteria, args.Filters));
    MapManageEntityFilters(args.Filters, criteria);
    helper.ApplyFilters();
    if (args.SortInfo == null || string.IsNullOrEmpty(args.SortInfo.FieldName))
        return criteria;
    IList<IProjection> sortMap = StaticPresenter.GetSortMap_ManageEntity(args.SortInfo.FieldName, args.RecordId, args.ExtraParams, criteria);
    if (sortMap == null)
        sortMap = EntityJSON.GetSortMap(args.SortInfo.FieldName, criteria);
    helper.ApplySort(sortMap);
    return criteria;
}
So, here is where the question comes in. As mentioned, the version of ExtJS we're using is 2.3, and we're looking to upgrade to the current version. I've done some initial homework of googling and looking through the Sencha documentation, but there are some things I'm unclear on and would like to get addressed before I start getting hands-on with this effort. I've tried to outline my specific questions below.
First and foremost: Is the way our application is built even possible with ExtJS 6? By this, I mean leveraging the ExtJS API to define controls in the .js file and then create a UI on top of a .NET C# backbone. Based on the change notes and questions from other users, it’s pretty apparent that there have been massive (understatement) changes between 2.3 and 6. I guess what I’m getting at is that based on what I’ve read it seems you can now build your entire app, including the model and view (and controller?) in ExtJS. Is this a requirement, or can we still lay ExtJS controls on top of our .NET C# model and view?
As a follow-up, I've been seeing references to Sencha Cmd for creating and building the app, etc. Is Cmd going to be required no matter what? Or can we simply reference the ExtJS library like we're currently doing?
Assuming the answer to question 1 is yes it’s possible, the next obvious question becomes: how much work is this going to be? Let’s get the “a lot” answer out of the way—I know. What I do know is that we will have to update all of our templates to use the new API syntax (new Ext… to Ext.create() etc). I’m okay with this. What I’m trying to figure out is what I don’t know. Assuming I update all of the syntax, would our application work? Or are there other things I need to change/fix in order to get it working?
Related to question 2: based on my reading, it looks like the way data stores for controls work has changed, and they now use models defined in ExtJS. Is this a requirement? As described earlier, we're currently using web methods in the aspx.cs file. Am I going to need to duplicate our C# model in ExtJS?
Lastly, I see this asked a lot but I can’t seem to find a definitive answer. Classic vs modern? The answer I typically see is that modern is aimed more towards touch screens and modern browsers, while classic is more geared toward desktop users. I’ve also read in places that modern has fewer controls available. Our web app is running in a local environment and will not be going to mobile in the future, which leads me to think classic might be the right choice? I guess I’m just wondering technically what the difference is.
I’m sure there are things I don’t even know I’m missing. Any and all feedback is welcome.
It is possible, but you will have to do a lot of rewriting by hand. Just three weeks ago I had to migrate an ASP application from ExtJS 3.4 to 6.2.1.
You can either put the values in a global variable and add them to the main view's ViewModel on startup, or load them right away in onBeforeLaunch.
Then code your app and build it using Sencha Cmd. At the end, wire it all back into your ASP pages.
As for how much work it will be ... that depends a lot on how structured your code is and how easy it will be to rewrite.
Let's assume it is written in the same style all over the application; then it will be relatively easy.
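To give a rough idea of the syntax change mentioned in the question (new Ext... to Ext.create()), here is a hedged sketch of how the grid pieces above might look in ExtJS 6 classic. It is not a drop-in migration: the class names, proxy type, and endpoint URL are illustrative assumptions, and the custom PageMethodProxy would need to be ported or replaced.

// Model replaces the JsonReader field list (field names taken from the 2.x code above)
Ext.define('App.model.Entity', {
    extend: 'Ext.data.Model',
    fields: ['Id', 'Name', 'ActiveFlag', 'CreatedTimestamp']
});

var store = Ext.create('Ext.data.Store', {
    model: 'App.model.Entity',
    remoteSort: true,
    proxy: {
        type: 'ajax',                        // assumption: plain AJAX proxy instead of PageMethodProxy
        url: 'Entity.aspx/GetManageEntity',  // assumption: whatever endpoint the web method exposes
        reader: {
            type: 'json',
            rootProperty: 'Results',         // 'root' became 'rootProperty'
            totalProperty: 'Total'
        }
    }
});

var grid = Ext.create('Ext.grid.Panel', {    // was: new Ext.grid.GridPanel(...)
    store: store,
    columns: [
        { text: 'Name', dataIndex: 'Name', sortable: true, flex: 1 },
        { text: 'Entity ID', dataIndex: 'Id', hidden: true }
    ]
});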

Nested attributes with Angular.js

I have been racking my brain and Google all morning trying to figure this out, but I have come to the conclusion that I need to ask the experts! I am trying to do nested attributes with Sinatra and Angular. Don't worry about the Sinatra side of things; at the moment I am just trying to get the data to the server side in the correct format. Please see the code below.
My Input:
<input type="text" placeholder="{{item.placeholder}}" ng-model="question.possible_answer_attributes[$index][title]" class="form-control" />
My model object:
$scope.question = new Question({
  poll_id: parseInt($routeParams.id),
  title: '',
  kind: 'open',
  possible_answer_attributes: [] // I believe the issue may lie here
});
My factory:
.factory('Question', function($resource) {
  return $resource('/api/questions/:id', { id: '@id' }, {
    'update': { method: 'PUT' },
    'get_by_poll': { method: 'GET', url: '/api/questions_by_poll/:id', isArray: true }
  });
})
My object at the time of running the save function:
{"poll_id"=>1, "title"=>"123123", "kind"=>"multiple", "possible_answer_attributes"=>[{"undefined"=>"412312"}, {"undefined"=>"1234124"}, {"undefined"=>"234235234"}]}
I do not understand why my "possible_answer_attributes" keys are coming through as "undefined". It may be something very simple that I have missed, but any feedback would be great!
Thanks in advance!
In order to address the title property, you would need to use a string to index into the object:
ng-model="question.possible_answer_attributes[$index]['title']"
This should hold as long as possible_answer_attributes array looks like:
[{ title: 'my title' }]
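For example, the corrected input would look like this (a sketch of the fix, not code from the original post):

<input type="text" placeholder="{{item.placeholder}}" ng-model="question.possible_answer_attributes[$index]['title']" class="form-control" />

Without the quotes, Angular evaluates title as a (nonexistent) scope property, so the computed key is undefined, which is why every key comes through as "undefined".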

twitter typeahead.js and access to element

I am using the latest twitter typeahead.js. I would like to access the data-url property within the remote option, like the following.
I should be able to reference the url property via $('#txt2').data('url'), but I would like to have a generic typeahead function. I am guessing $(this) doesn't have the same context as in the valueKey usage.
Any ideas how I can reference the element that is using type ahead within the remote option?
<input type="text" id="txt2" data-name="Locations" data-url='Employee/Locations' data-valueKey="Description" />
$('#txt2').typeahead({
  name: $(this).data('name'),
  minLength: 2,
  valueKey: $(this).data('valueKey'),
  remote: {
    url: '',
    replace: function () {
      alert(self.data('valueKey'));
      var u = $(this).data('url') + '?q=%QUERY';
      alert(u);
      return u;
    },
    filter: function (parsedResponse) { // parsedResponse is the array returned from your backend
      return parsedResponse;
    }
  },
  template: [
    '<p class="name">{{Description}}</p>'
  ].join(''),
  engine: Hogan // download and include http://twitter.github.io/hogan.js/
}).on('typeahead:selected', function (obj, datum) {
  alert(datum.Code);
});
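One pattern worth trying (a sketch only, not from the original post) is to wrap the setup in a helper that closes over the element, so the remote callbacks can reach its data-* attributes without relying on $(this):

// Generic attach function: $el is captured by the closure, so the replace
// callback can read the element's data-url regardless of its own `this`.
function attachTypeahead($el) {
  $el.typeahead({
    name: $el.data('name'),
    minLength: 2,
    valueKey: $el.data('valueKey'),
    remote: {
      url: '',
      replace: function () {
        return $el.data('url') + '?q=%QUERY';
      }
    },
    template: '<p class="name">{{Description}}</p>',
    engine: Hogan
  });
}

attachTypeahead($('#txt2'));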

Typeahead.js: Force reload of local dataset

I want to implement a raw fallback option akin to Twitter's web app, where it has a final autocomplete option of the value of the input (e.g. Search all people for {{input.val()}}).
My current implementation fails because Typeahead.js doesn't reload local datasets, so the desired effect only happens on the first keyup event:
var plusone = [
  {
    value: '',
    tokens: ''
  }
];

$('#name').keyup(function () {
  plusone[0].value = $('#name').val();
  plusone[0].tokens = $('#name').val();
});

$('#name').typeahead([
  {
    local: plusone
  }
]);
According to the documentation and this tutorial, there is no way of reinitialising typeahead without destroying it first, which I'd prefer not to do for performance reasons. Any suggestions on a better implementation or a fix would be much appreciated (if anyone from Twitter is out there, I'd love to know your implementation).
You can do this by adding a new dataset to typeahead with a custom source function.
var nbaTeams = new Bloodhound({
  datumTokenizer: Bloodhound.tokenizers.obj.whitespace('team'),
  queryTokenizer: Bloodhound.tokenizers.whitespace,
  prefetch: 'nba.json'
});
nbaTeams.initialize();

$('#autosuggest-input').typeahead({
  highlight: true,
  hint: false
}, {
  name: 'nba-teams',
  displayKey: 'team',
  source: nbaTeams.ttAdapter(),
  templates: {
    header: '<h3 class="league-name">NBA Teams</h3>'
  }
}, {
  name: 'advanced-search',
  displayKey: 'name',
  // For every dataset, typeahead expects you to provide a source property,
  // which is a function that accepts two arguments: query and cb. You can
  // do whatever you want in that function. In this case, regardless of the
  // query provided, we always return the same single fallback result.
  source: function(query, cb) {
    var result = [{
      'name': 'Advance search for "' + query + '"'
    }];
    cb(result);
  },
  templates: {
    header: '<div style="border-top: 1px solid black;"></div>'
  }
});
Demo & credits: http://plnkr.co/edit/cjL6nZtShyxmLjWxzdBC?p=preview
