Image Template Site

Recently I was tasked with building a new template for our Photo and Production teams to make it quicker and easier to view our templates for each of our product categories and subcategories. Currently our team members use a version of Smartsheet with nested categories and an overload of information. My goal is to build a web application that is image-centric and displays pertinent metadata in a concise, out-of-the-way format. One of the big issues we have is that Smartsheet displays primarily text, is slow to load, and is difficult to wade through to get to the data you need.

By replacing Smartsheet with three dropdowns that auto-populate their options, we don't need to spend time looking up text data when, 95% of the time, images are the only data that needs to be delivered. On the back end, the data is stored as JSON; the front end is powered by jQuery.
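A minimal sketch of how the cascading dropdowns might be driven (the JSON shape, category names, and function name here are illustrative assumptions, not the production schema): each select's options come from walking one level deeper into the nested data.

```javascript
// Nested category data, stored as JSON on the back end (shape is illustrative).
var categories = {
  'Apparel': { 'Shirts': ['Front', 'Back'], 'Pants': ['Front'] },
  'Footwear': { 'Boots': ['Side', 'Sole'] }
};

// Given the current selections, return the options for the next dropdown.
function optionsFor(data, category, subcategory) {
  if (category === undefined) return Object.keys(data);
  if (subcategory === undefined) return Object.keys(data[category] || {});
  return (data[category] || {})[subcategory] || [];
}
```

In the page itself, a jQuery change handler on each select would call something like optionsFor with the current selections and rebuild the next dropdown's option list from the returned array.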


Live link

Project: Keyword Extraction from CSV

Using PapaParse, I'm able to quickly parse a relatively large CSV file into a JS array of objects. My biggest issue with the way PapaParse returns the data is that it doesn't let me choose the first field as the object key, so I've been learning the .filter method (though I was worried that iterating over an array of objects in JS would be slow, especially with a dataset that large). The takeaway here is that filter runs a custom function on each ELEMENT of the array (and passes the anonymous function a temporary reference named however I like). In this case my function looks like this:

 var newArray = workingResArray.filter(function (el) {
   // el is a reference to each array ELEMENT that filter iterates over
   for (var i = 0; i < searchArray.length; i++) {
     if (el['Long Name'] === searchArray[i]) {
       console.log("we found:", searchArray[i]);
       return true; // truthy: keep this element in newArray
     }
   }
   return false; // no match: filter drops this element
 }); // end of the callback passed to filter

This wasn't immediately apparent to me and took a little bit of tinkering (it's been a while!). Inside this callback I search for the relevant key data, and in my latest iteration I load all the relevant data from the export (now 200k rows and about 18 MB). A search for 20 unique entries takes about 80 ms total to compile the relevant data and write it out to a CSV for one of our plugins. Cool!
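Since PapaParse won't key the results by a field for me, one possible workaround (a sketch, assuming each row has a unique 'Long Name' value; the function name is mine, not PapaParse's) is to build a lookup object once with reduce, so that each subsequent search is an O(1) property access instead of an array scan:

```javascript
// Build a lookup object keyed by a chosen field, so repeated searches
// become O(1) property accesses instead of O(n) array scans.
function keyByField(rows, field) {
  return rows.reduce(function (lookup, row) {
    lookup[row[field]] = row;
    return lookup;
  }, {});
}

// Tiny stand-in for PapaParse's array-of-objects output:
var parsedRows = [
  { 'Long Name': 'Widget A', 'SKU': '100' },
  { 'Long Name': 'Widget B', 'SKU': '200' }
];
var byLongName = keyByField(parsedRows, 'Long Name');
// byLongName['Widget A'].SKU === '100'
```

With a 200k-row export this trades one up-front pass for constant-time lookups on every search term afterward.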

Encoding woes

This project is a fun time-buster that uses Python to retrieve and parse an ICS file for use in a larger project.

Topics covered so far in this exercise include importing and using the requests module, as well as breaking myself of the habit of trailing semicolons!

One big issue I ran into: encoding. On my local Python install / IDLE, the encoding was correct and all nonstandard English characters showed up properly when printing to the console. To be completely honest, I hadn't put much thought into encoding before, but after three days of reading it's finally starting to dawn on me! The second encoding issue was at the server level: my server's shell was set to some ANSI encoding, and despite declaring the encoding in the Python file, I couldn't get the server to use the proper one, so when the script ran on the server the console printed a few garbage characters!

Solution: editing my .bash_profile and .bashrc files to set the encoding worked a treat (though I'm still not entirely convinced this is the correct way, despite being explicit in the file itself…).
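For reference, the lines I mean are the standard locale exports (the exact locale name is an assumption; `locale -a` on the server lists what's actually available):

```shell
# In ~/.bashrc (and/or ~/.bash_profile): force a UTF-8 locale so Python's
# stdout encoding defaults to UTF-8 instead of the shell's ANSI setting.
export LANG=en_US.UTF-8
export LC_ALL=en_US.UTF-8
```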

My next task is to figure out how to get the characters to display correctly in the HTML file that I output. After reading Joel Spolsky's "The Absolute Minimum Every Software Developer Absolutely, Positively Must Know About Unicode and Character Sets (No Excuses!)", it's starting to make more sense: the lack of encoding data in the head of my rendered HTML document probably has to do with it displaying weird in my browser!

<meta http-equiv="Content-Type" content="text/html; charset=utf-8">

Edit: I was incorrect! My encoding was wrong, but it was ANSI, not UTF-8. I'm still scratching my head and need to do some more work on understanding encoding! Changing the charset from utf-8 to ansi caused the special characters to render correctly in the HTML document.

Question marked as answered (for now)

Python and cron jobs

I'm moving on from jQuery and the JS portion to do more server-side and cross-domain programming for my next set of programming goals. Installing Python 3 on DreamHost was not impossible, though I'm still stuck a little bit on a couple of small details:

  1. Programs run in the shell work fine using "python3 scriptname".
  2. Scripts accessed via the web server do NOT run using Python 3 (with /usr/bin/env python3), even though .bashrc and .bash_profile have the $PATH variable set according to the DreamHost tutorials.
  3. Cron jobs work after I prepend the original call with
    source ~/.bashrc; rest of code
    Helpful solution here
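For the cron case, a sketch of what the crontab entry ends up looking like (the schedule, script path, and filename are placeholders, not my actual job): sourcing .bashrc first gives the job the same $PATH the interactive shell has, so python3 resolves to the user-local install.

```shell
# m h dom mon dow  command
0 6 * * * source ~/.bashrc; python3 /home/username/scripts/fetch_ics.py
```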

I would ideally like to keep #!/usr/bin/env python3 so I can write and test scripts locally before uploading without needing to change it.

My current workaround is specifying, at the top of the script, the path to Python 3 that I find using the "which python3" command.


Question:  Why is /usr/bin/env python3 not working?

This script works when I change the shebang line to point to my local install of Python 3 (3.6.2). Otherwise, at runtime (served up by accessing the file via browser) it yields Python 3.4.x. DreamHost runs Python 2 by default and I installed 3.6.2, so where is this other version coming from??


CSV manipulation

Another small project for work, in which the department requests that we take a weekly export of data in the form of a CSV (or, more accurately, a list of semicolon-separated values, SSV?) and append certain data depending on the year found within each row.

Another fun, quick project written in JavaScript/jQuery. It uses the browser's FileReader API (a native DOM API, not actually part of jQuery) together with the change event on an <input type="file"> element to load the selected file into a new FileReader instance and begin an asynchronous read that loads the lines into an array. From there the array is copied and the relevant keywords/data are appended to the end of each array element according to predefined criteria.

This code is executed and creates the new payload client-side.
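The per-line transform itself can be kept pure and simple; here's a sketch (the year-to-keyword mapping, the regex, and the function name are assumptions for illustration, not the real business rules):

```javascript
// Append a keyword to each semicolon-separated row based on the first
// 4-digit year found in that row. yearTags maps a year string to the
// text to append; rows with no matching year pass through unchanged.
function tagRows(lines, yearTags) {
  return lines.map(function (line) {
    var match = line.match(/\b(19|20)\d{2}\b/);
    var tag = match ? yearTags[match[0]] : undefined;
    return tag ? line + ';' + tag : line;
  });
}

// Illustrative stand-in for rows read out of the weekly export:
var rows = ['widget;2017;blue', 'gadget;2016;red', 'gizmo;unknown'];
var tagged = tagRows(rows, { '2017': 'NEW', '2016': 'LEGACY' });
// tagged → ['widget;2017;blue;NEW', 'gadget;2016;red;LEGACY', 'gizmo;unknown']
```

Because the transform never touches the DOM, it can run on the array the FileReader callback produces and the result can be offered back to the user as a download, all client-side.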

Script Here