While developing these demonstrations of logistic regression and neural networks, I discovered and used some interesting methods and techniques:
A few useful commands and packages...:
- update.packages() for updating installed packages in one easy action
- as.formula() for creating a formula that I can reuse and update in one place across all my code sections
- View() for looking at data frames
- fourfoldplot() for plotting confusion matrices
- neuralnet for developing neural networks
- caret, used with nnet, to create predictive models
- plotnet() in NeuralNetTools, for creating attractive neural network diagrams
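A minimal sketch tying a few of these commands together: a reusable formula built with as.formula(), a logistic regression fit with glm(), and the resulting confusion matrix plotted with fourfoldplot(). The dataset (mtcars), predictors, and the 0.5 classification threshold are illustrative assumptions, not taken from the original demonstrations.

```r
set.seed(42)

# Define the model formula once so it can be reused and updated
# in one place across multiple models and code sections.
f <- as.formula(am ~ mpg + wt)

# Logistic regression: predict transmission type (mtcars$am, 0/1)
# from fuel economy and weight.
model <- glm(f, data = mtcars, family = binomial)

# Classify with an illustrative 0.5 threshold, then cross-tabulate
# predictions against the actual labels.
pred <- ifelse(predict(model, type = "response") > 0.5, 1, 0)
cm   <- table(Predicted = pred, Actual = mtcars$am)
print(cm)

# fourfoldplot() expects a 2x2 table and draws the confusion matrix
# as four quarter-circles whose areas reflect the cell counts.
fourfoldplot(cm, color = c("#CC6666", "#99CC99"))
```

Because the formula lives in `f`, swapping predictors later means editing a single line rather than every call that fits or plots a model.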
Resources that I used or that I would like to explore...
- MS Azure Notebooks, for working online with Python, R, and F#, all part of Microsoft's data workflows
- Efficient R Programming, which seems to have many good tips on working with R
- Data Mining Algorithms in SSAS, Excel, and R, showing various algorithms in each technology
- R Documentation, a high-quality, usable resource
Microsoft provides free access to its online Visual Studio Team Services (VSTS), and I've been using the service for some time. I'd long wanted to restructure my code hierarchy, and recent changes in my work environment, including automated build and deployment using Octopus, nudged me to finally take the task on. So in the past few weeks I've:

- Restructured my Code library into one big project with sub-projects for Development, Websites, and Work
- Developed my Work hierarchy of Epics, Features, Stories, and Tasks, along with queries and sprint boards
- Automated all of my builds via check-in, adding extensions to evaluate code and build quality
- Developed a dashboard to oversee the status of work

Code Library
I was frustrated with the limitations of working with my code library. After reading opinions on best practices, I settled on one big project for all my code, which I expected would make it easier to manage my time and energy and would give me a global view of my individual projects.