This month’s main project was the user search for MWoosh that I mentioned last month. Along the way, I also picked up some new technical knowledge.
The main challenge of the MWoosh advanced user search was making it so you can pick and choose which parameters to search by. Fortunately, there was a tutorial on exactly what I wanted to achieve, and it was good enough that I could easily adapt the code to work with my data. The way it worked was quite simple: check whether an input had been filled in and, if so, add it to the SQL query.

A colleague told me to have a look at something called Elasticsearch, and from what I can gather you basically search JSON documents rather than running SQL queries like I was doing. However, I had quite some difficulty setting it up, and it got to the point where it wasn’t worth putting more effort in, seeing as I already had something working. The search page is also only going to be used internally, so I didn’t have to worry too much about making it as efficient as possible.

The front end was a bit more of a challenge. To display the results I chose Bootstrap Table, as my research and testing showed it would make it easy to group by columns and select rows.

The second part of this project will be selecting one or more users from the search and sending them messages (which could also include an invitation to a promotion). These messages are only going to be one-way, so they would be relatively simple to set up: just a table with the user ID and the message. It also contains a “has read” column, which lets me show the number of new messages the user has (like the Facebook notification badge). The message would contain some HTML, mainly a button. This button would either take you to a page to apply for a promotion or allow you to accept the invitation directly. The apply page would contain a text box so the user can give any additional information, such as why they should be accepted. Again, I just needed to set up a new table with the user ID, promotion ID, and the additional-info column.
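The “only add a condition to the SQL query when the input was actually filled in” idea can be sketched roughly like this. The real project was PHP and MySQL; this is just an illustrative Python sketch, and the table name, field names, and example values are all made up:

```python
def build_search_query(filters):
    """Build a parameterised SQL query from only the filters provided.

    Hypothetical sketch: 'users', the column names, and the filter fields
    are invented for illustration, not taken from the real project.
    """
    sql = "SELECT * FROM users WHERE 1=1"
    params = []
    # Map each possible form field to its SQL condition
    conditions = {
        "username": "username LIKE ?",
        "country": "country = ?",
        "signup_after": "signup_date >= ?",
    }
    for field, clause in conditions.items():
        value = filters.get(field)
        if value:  # only append the clause if the input was filled in
            sql += " AND " + clause
            params.append(value)
    return sql, params

# Only two of the three possible filters were supplied, so only two
# conditions end up in the query.
sql, params = build_search_query({"username": "%smith%", "country": "GB"})
print(sql)     # -> SELECT * FROM users WHERE 1=1 AND username LIKE ? AND country = ?
print(params)  # -> ['%smith%', 'GB']
```

Using placeholders and a separate parameter list (rather than concatenating the raw input into the SQL string) also keeps the dynamic query safe from SQL injection.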
After these things were done, all that was left was to put it live, but I finished just before I went on holiday, so I will put this live when I am back (in the upcoming week).
Another project I did was on the competition platform: creating a sort of media pack/info page for each of our competition sites. One thing I was very happy about on this project was that the new technical apprentice (who has a design background) would create a mock-up for the front end, and I would just have to implement it. I usually struggle with designing a front end, and my perfectionism means I always end up spending a stupid amount of time on the tiniest details, so as mentioned, I was glad to find out I wouldn’t have to do the front.

The back end, though, was quite a challenge, as some of the queries to get the stats we wanted (such as average entries per site) were quite large and took a fair amount of time. I found this out the hard way when I visited the page after putting it live (the dataset on our dev database was small enough that I didn’t notice the problem until then): it took a long time and eventually returned a 504 Gateway Timeout error, which you sometimes get when a website is down. After a mini heart attack, I realised the site wasn’t actually down…

After consulting our senior dev, he thought it would be best to create a script for this query that runs every month and stores that information in a table, so the page would only ever have to query that table. (He also introduced me to table indexes, which help the database find column values quickly, so if used correctly they can increase the speed of your SQL queries.) However, the dataset was so big that I struggled even to run the script once just to get the initial data into the table. Part of the script involved an array, and because of the size of the dataset the server was running out of memory. I looked into using an SplFixedArray, but even then I was still running out of memory. After much trial and error, I managed to work out a way to get the data without using arrays at all, and so I now have a script that runs every month.
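The pre-aggregation approach described above can be sketched as follows: run the heavy stats query once (e.g. monthly), store the result in a small summary table, and have the page read from that instead. This is an illustrative sketch using SQLite from Python, not the project’s actual PHP/MySQL code, and every table, column, and stat name here is invented. Doing the aggregation entirely in SQL also avoids holding a huge array of rows in application memory:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hypothetical raw data table (names invented for illustration)
    CREATE TABLE entries (site_id INTEGER, user_id INTEGER, entered_at TEXT);
    -- An index on the column the heavy query filters/groups by can speed it up
    CREATE INDEX idx_entries_site ON entries (site_id);
    -- Small summary table that the live page actually queries
    CREATE TABLE site_stats (site_id INTEGER PRIMARY KEY, avg_entries REAL);
""")
conn.executemany(
    "INSERT INTO entries VALUES (?, ?, ?)",
    [
        (1, 1, "2017-07-01"),
        (1, 1, "2017-07-05"),
        (1, 2, "2017-07-02"),
        (2, 3, "2017-07-03"),
    ],
)

# The "monthly script": compute average entries per user for each site,
# entirely inside the database, and store the results in the summary table.
conn.execute("""
    INSERT OR REPLACE INTO site_stats (site_id, avg_entries)
    SELECT site_id, COUNT(*) * 1.0 / COUNT(DISTINCT user_id)
    FROM entries
    GROUP BY site_id
""")

# The page now runs a cheap query against the tiny summary table.
print(conn.execute("SELECT * FROM site_stats ORDER BY site_id").fetchall())
# -> [(1, 1.5), (2, 1.0)]
```

The trade-off is that the page shows figures that are up to a month old, which is fine for an internal info page but wouldn’t suit anything that needs live numbers.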
One annoying thing about this project, though, was that I was tasked with finding and extracting the information for the media pack side of the page. This involved trawling through Time Inc info on the site looking for things to include, which didn’t seem like the best use of a programmer’s time. But I managed to scrape together enough info for the pages.