Personal Statement and Career Detail

 

Personal Statement

I own problems.

I analyse problems.

I solve problems.

I design systems.

I build maintainable systems.

I make systems robust.

I speed systems up.

I make systems easy to use.

I work on my own.

I work in teams.

I run teams.

I take on short jobs with steep learning curves.

I am the last man standing on long projects.

I try hard all the time.

 


Career Detail

 

 

29   Sep 14 – Dec 14   4 months   Eurotunnel

Return to Eurotunnel (for the nth time) to work on the replacement of their current CRM system, held under SAP.  Analyse their current data and systems through interviews and study of documents, and interview staff to find out the real requirements.  Start the process of thinking through the problems and all their assumptions, and of producing a data model that supports the requirement.  Produce proposals and plans to carry out the development, and provide detailed analysis of some of the data problems that they have.  Carry out high-level design, together with detailed design in some problem areas, for a SQL Server environment, and document it all.  Provide documents for consumption by senior management and for suppliers to quote from.  Prepare presentations and give them to various forums.

28   Mar 14 – Aug 14   6 months   PD/Charity

As part of PD, progress the software textbook (see 23) much closer to publication.  People are actually reading it (!), so deal with their comments.

Complete the last manual data loads and deduplication for the charity (see 26) using my tools, as they switched to online entry by supporters.  Grant them a perpetual licence to my fuzzy matching software.

27 (17)   Oct 13 – Feb 14   5 months   Sodexo

Return to Sodexo for a fourth time.  Their SSAS BI system in SQL Server was producing incorrect results; replicate the reports in SQL to generate and prove correct results, use SQL to pinpoint the errors in their models, and then correct the models.  Provide PL/SQL expertise for a website using .NET Web Services, contributing to the design through analysis of the existing Oracle system.  For example, provide single SQL cursors, working in a stateless fashion with much improved efficiency, to replace a complex interface supplying data to web pages; and identify that the web team were using the wrong key to access the data.  Build complex database functions for the system.
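As an illustration of the stateless single-cursor approach (a minimal sketch only – the function, table and column names are hypothetical, not the Sodexo schema): one PL/SQL function returns a ref cursor holding exactly one page of data, so the web service keeps no session state between requests.

    CREATE OR REPLACE FUNCTION get_order_page (
      p_customer_id IN NUMBER,   -- hypothetical key
      p_page_start  IN NUMBER,   -- first row of the page
      p_page_size   IN NUMBER
    ) RETURN SYS_REFCURSOR
    IS
      l_cur SYS_REFCURSOR;
    BEGIN
      -- One cursor supplies the whole page; nothing is held open
      -- between web requests, so each call is stateless.
      OPEN l_cur FOR
        SELECT order_id, order_date, total_value
          FROM (SELECT o.order_id, o.order_date, o.total_value,
                       ROW_NUMBER() OVER (ORDER BY o.order_date DESC) AS rn
                  FROM orders o
                 WHERE o.customer_id = p_customer_id)
         WHERE rn BETWEEN p_page_start AND p_page_start + p_page_size - 1;
      RETURN l_cur;
    END get_order_page;
    /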

 

26   Feb 13 – Oct 13   9 months++   Charity

Analyse a charity's CRM data, held under thankQ in a SQL Server database, and report to the director.  Advise on how to improve their data.  Show how the thankQ model was not supporting their data, and how to change it to eliminate that problem.  Identify the 20% of addresses that were no longer used and how to remove them – then remove them.  Convert my tools from PL/SQL to T-SQL to cleanse their data, including removal of duplicates.  Carry out that data cleansing and develop standards to enable volunteers to process the data consistently.  Advise on expansion and how to generate more income from the data, and set up processes to capture and analyse it.  Use my tools to run their data loads as data arrived over the following months, before they switched to online entry.
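The flavour of the duplicate removal can be sketched as a match-key query (shown in Oracle-style SQL for consistency with the rest of this document; the charity work used the T-SQL port, and the real tools are far more sophisticated – the table and columns here are hypothetical):

    -- Build a crude match key from normalised name sound and postcode,
    -- then report any key shared by more than one supporter as a
    -- candidate duplicate group for review.
    SELECT SOUNDEX(UPPER(surname)) || '|' ||
           REPLACE(UPPER(postcode), ' ', '') AS match_key,
           COUNT(*)                          AS group_size
      FROM supporters
     GROUP BY SOUNDEX(UPPER(surname)) || '|' ||
              REPLACE(UPPER(postcode), ' ', '')
    HAVING COUNT(*) > 1;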

 

 

25   Sep 12 – Jan 13   5 months   Steve Hudson/PD

Another small incursion into websites, building a site for a musical entertainer: a very simple site displaying static information and playing music.
During this time, as part of my PD, build multiple Oracle databases under VMware on Linux.

 

24 (7)   Feb 12 – Aug 12   7 months   CGG

Return to CGG Veritas for the second time.  Design and build an interface to move data from database pairs at server centres on different continents to remote sites over low-bandwidth links, process the data at the remote sites and return the results.  This was mostly PL/SQL triggers, a new package and many modifications to existing packages.  Interfacing was done to systems with little or no documentation – forensic programming!  Oracle 10g, PL/SQL, Toad.
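The capture side of such an interface can be sketched as a trigger feeding a local queue, so that only deltas ever cross the slow link (a minimal illustration with hypothetical names, not the CGG design):

    CREATE SEQUENCE ship_queue_seq;

    -- Outbound delta queue held at the server centre; a background
    -- job ships unsent rows to the remote site and marks them sent.
    CREATE TABLE ship_queue (
      queue_id NUMBER PRIMARY KEY,
      job_id   NUMBER,
      payload  VARCHAR2(4000),
      sent     CHAR(1) DEFAULT 'N'
    );

    CREATE OR REPLACE TRIGGER trg_capture_job
    AFTER INSERT OR UPDATE ON jobs   -- 'jobs' is an illustrative source table
    FOR EACH ROW
    BEGIN
      -- Queue only the changed row, never the whole data set.
      INSERT INTO ship_queue (queue_id, job_id, payload)
      VALUES (ship_queue_seq.NEXTVAL, :NEW.job_id,
              :NEW.status || '|' || TO_CHAR(:NEW.updated_at, 'YYYYMMDDHH24MISS'));
    END;
    /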

 

23   Aug 11 – Jan 12   6 months   PD

Design a database and implement it in MySQL to support a wine website.  A novel dynamic query mechanism was proposed and implemented.  This was for a start-up company and carried out in unusual circumstances!
As part of PD, convert many of my notes into the first pages of my software development book.

22   Jun 10 – Jul 11   13 months   Ab Initio

External consultant at Ab Initio.  After a lengthy induction process, a series of short engagements:

Resolving an interface with Oracle that was running too slowly.

Analysing a telecoms invoice program in Informix SQL, writing analysis programs to parse Informix stored-procedure SQL, format programs and determine call hierarchies.  Analysing billing programs to discover a large number of bills that were not being created, and showing the loss of revenue.

Advising on database and SAP interfaces for a banking client.

Assisting on three deliveries of the Ab Initio basic training course.

Analysing a name and address processing application and developing fuzzy matching.

Teradata interfacing.  PGP graph creation.

 

21   Dec 09 – May 10   6 months   EDF

Analyse and design the marketing database for EDF in their new Orchard system.  Based on an Oracle database, it took feeds from many sources (SQL Server, Oracle, SAP, flat files) and carried out campaign management and price change functions.  This included synchronisation with SAP CRM, to provide feeds for it and to handle all the marketing functions that SAP could not.  The data consisted of all prospects plus customers, and all their billing data for many years.  Design the synchronisation interface with SAP and demonstrate the need for fundamental changes to the SAP data processing.  Produce the design document, the data model and the functions needed to achieve this.  Warn of all the race conditions and synchronisation problems.  Advise the prototype team on what would go wrong with their implementation, and use it to demonstrate the synchronisation problems!

 

20   Apr 09 – Nov 09   8 months   C and C

Design and build an image processing system used to find duplicates in a forensic database.  This required encoding images in a manner that enabled fuzzy matching to handle rotation, scaling and even stretching of the images, yet still recognise the matches.  Carry out the research and prove the method against a test database.  Written in PL/SQL, with Oracle Forms providing the image handling support.
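One way to picture invariant fuzzy matching is to reduce each image to a numeric signature that is unchanged by rotation and scaling, then compare signatures within a tolerance.  A minimal PL/SQL sketch of the comparison step (hypothetical types and names; not the method actually developed):

    CREATE OR REPLACE TYPE sig_tab AS TABLE OF NUMBER;
    /

    CREATE OR REPLACE FUNCTION signatures_match (
      p_sig_a IN sig_tab,            -- invariant feature vector of image A
      p_sig_b IN sig_tab,            -- invariant feature vector of image B
      p_tol   IN NUMBER DEFAULT 0.05
    ) RETURN BOOLEAN
    IS
      l_dist NUMBER := 0;
    BEGIN
      -- Euclidean distance between the two signatures; a small
      -- distance flags a candidate duplicate for human review.
      FOR i IN 1 .. p_sig_a.COUNT LOOP
        l_dist := l_dist + POWER(p_sig_a(i) - p_sig_b(i), 2);
      END LOOP;
      RETURN SQRT(l_dist) <= p_tol;
    END signatures_match;
    /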

 

19   Oct 08 – Mar 09   6 months   C and C

Take over their name and address matching and knowledgebase.  Analyse the errors that had been in it for years and remove them.  Re-write much of it to the Javelin standard.  Use the improved tools to win three dedup competitions, then use them to implement dedup-and-merge systems for two utility companies.  Design the interfaces between a SQL Server reception database and the source systems.  Carry out optimisation of both systems.  All this work was done in PL/SQL using Toad.

18   May 08 – Sep 08   5 months   PD

Return to certification studies and to writing the notes for a software textbook.  Carry out occasional support of Javelin at Informa.  Continue the development of an AI name and address processor.

17   Jun 06 – Apr 08   23 months   Sodexhopass

Sodexhopass had inherited a system that they knew little about.  I was employed to find out what its parts did, how they did it, and to document it.  I found numerous bugs and learnt a lot about PL/SQL and how it can be misused.  Then I investigated a very slow process that the in-house people had looked at extensively with little success.  In a short time I was able to take the run time down from hours to minutes – a fresh pair of eyes?  I then looked at several other slow processes.  Finally there was a large, many-tentacled upgrade that they had no confidence about promoting to live.  I investigated it, documented its whole-system impact in the development area to give them the confidence to deliver it to live, and carried out a complete upgrade in the test environment.

Then I designed and developed an interface between their bespoke system and the Pivotal CRM: a near real-time feed of all changes from the Sodexhopass system into the Pivotal database on separate servers.  This meant analysing the bespoke system and then designing the database to accept the data within the Pivotal schema.  A number of race conditions and all the server and communications failure modes had to be catered for to ensure no data was missed.  The system had to be loosely coupled, with neither side of the interface dependent on the other being present.  It also had to be able to recover and ‘replay’ the interface, be auditable, and be robust enough to pick up automatically where it left off after a system crash or shutdown.  As usual, a large optimisation and tuning exercise had to be completed.
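The loosely coupled, replayable behaviour can be sketched with a change journal and a consumer watermark (hypothetical names; the real interface was considerably richer): every change is journalled with a monotonic id, and the consumer records how far it has got, so restart and replay fall out naturally.

    CREATE SEQUENCE change_log_seq;

    -- Source side: the journal survives the consumer being down.
    CREATE TABLE change_log (
      change_id  NUMBER PRIMARY KEY,   -- monotonic; gives ordering and audit
      table_name VARCHAR2(30),
      row_key    VARCHAR2(100),
      change_ts  DATE DEFAULT SYSDATE
    );

    -- Consumer side: one watermark row; replay = rewind it.
    CREATE TABLE feed_watermark (last_applied NUMBER);

    -- Body of the poll job: apply changes, then advance the watermark.
    DECLARE
      l_from NUMBER;
      l_last NUMBER;
    BEGIN
      SELECT last_applied INTO l_from FROM feed_watermark FOR UPDATE;
      l_last := l_from;
      FOR r IN (SELECT change_id, table_name, row_key
                  FROM change_log
                 WHERE change_id > l_from
                 ORDER BY change_id) LOOP
        -- apply the change to the target schema here
        l_last := r.change_id;
      END LOOP;
      UPDATE feed_watermark SET last_applied = l_last;
      COMMIT;   -- watermark and applied changes move together
    END;
    /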

The project manager was non-technical (but nonetheless very good), so I also drove much of the project, providing requirement specifications and manpower estimates and dealing with the Pivotal consultants.  There were a number of performance worries, so I spent some time carrying out impact analyses on the live database.

 
Next I designed and developed a scheduling system that automated the company's production systems.  I created an innovative ‘time-travel’ method that allowed the company to report on and predict production into the future for the first time.  I extracted all the user requirements beyond the scheduling, developed the system in PL/SQL and took it into live running.  I documented the system and wrote the user manual.

With a degree of paranoia I studied for the 1Z0-001 exam to start Oracle certification, when I could probably have (should have) passed with no revision at all.  I scored 100% and completed the 90-minute exam in 22 minutes, showing what a waste of time the study was.

I studied (much less than for the previous exam! – see 18) for Oracle certification 1Z0-147 and passed with 97%, thereby becoming an Oracle Certified Associate.  I then worked toward 1Z0-101, which is needed for the DBA certification thread, and passed with 98%.  Finally, on the developer thread, I studied for and passed 1Z0-141 with 98% to become an Oracle Certified Professional.

 

16 (8,14,15)   Feb 06 – May 06   4 months   Eurotunnel, Informa PLC, Topnet

I completed the development of Javelin by converting some of the forms to be web-based for demonstration to potential clients.  At the same time Eurotunnel and Informa PLC requested help with some complex data loads and extractions.

 

15   Jan 06   1 month   Informa PLC

During the development of Javelin, the support at Informa PLC, and the building of their local requirements, I had spent a lot of time tuning the application.  They asked that I condense those skills into a one-day tuning course for their developers and DBAs.

 

14   Jan 02 – Dec 05   48 months   Topnet

It was decided to upgrade Javelin, the CRM system I had designed and first built in 1988, to Forms V6, to make it totally OS-independent, and to add some new features, of which the most significant was an AI that would care for the data.  I converted all the forms to V6 and added many more, and converted all the database code into PL/SQL packages (some functions had already been moved to PL/SQL).  To give OS independence, all the code that resided outside the database was moved to Perl.  This allowed the concept of monitors to be fully implemented, which let a user invoke any function within the system from anywhere in the world.  It also allowed processing to be sent out to the monitors on whatever machine they were running on, enabling load management and robustness under failure.  Alongside the monitors, a multi-node queue manager was developed that gave a simple interface for complex queue management across many nodes.
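The heart of such a queue manager can be sketched in a few lines (a hypothetical simplification, nothing like the full multi-node implementation): a queue table that monitors on any node can consume without blocking one another.

    DECLARE
      -- Each monitor claims the oldest unclaimed task; SKIP LOCKED
      -- means busy nodes never block idle ones.
      CURSOR c_next IS
        SELECT queue_id
          FROM work_queue            -- hypothetical queue table
         WHERE status = 'READY'
         ORDER BY queue_id
           FOR UPDATE SKIP LOCKED;
      l_id work_queue.queue_id%TYPE;
    BEGIN
      OPEN c_next;
      FETCH c_next INTO l_id;
      IF c_next%FOUND THEN
        -- process the task here, then mark it done
        UPDATE work_queue SET status = 'DONE' WHERE queue_id = l_id;
      END IF;
      CLOSE c_next;
      COMMIT;                        -- releases the row lock
    END;
    /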

The application was a direct mail application, but it was also a CRM system and had all the ETL facilities of a data warehouse.  It is still a leader in the area of deduplication and matching, with true AI-aided fuzzy matching.  Combined with the AI that constantly works on the database to improve the quality of the data, it easily adapts to work on any data presented to it.

There is a query facility that generates database queries from user requests expressed in business terms.  These queries can be used to generate simple data sets to drive a mailing, or to drive the multi-dimensional analysis tool.
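The idea can be sketched as data-driven dynamic SQL (hypothetical tables and names, not Javelin's implementation): each business term maps to a stored predicate, and the query is assembled and opened at run time.

    -- query_terms maps a business term to a SQL predicate containing
    -- exactly one bind placeholder, so users never write SQL.
    CREATE TABLE query_terms (
      term      VARCHAR2(50) PRIMARY KEY,
      predicate VARCHAR2(500)          -- e.g. 'last_gift_date < :1'
    );

    CREATE OR REPLACE FUNCTION open_business_query (
      p_term  IN VARCHAR2,             -- e.g. 'lapsed since'
      p_value IN VARCHAR2
    ) RETURN SYS_REFCURSOR
    IS
      l_predicate query_terms.predicate%TYPE;
      l_cur       SYS_REFCURSOR;
    BEGIN
      SELECT predicate INTO l_predicate
        FROM query_terms
       WHERE term = p_term;

      OPEN l_cur FOR
        'SELECT contact_id, name FROM contacts WHERE ' || l_predicate
        USING p_value;
      RETURN l_cur;
    END open_business_query;
    /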

The system is designed to present an interface to the users driven off data within the database.  Therefore, even with bespoke parts of the application, the same system was delivered to all customers, and clients needed no further investment when Javelin was upgraded.

 

I designed and built the whole product.  I also extracted requirements from clients for their customised functions and built those.

 

13 (11)   Mar 01 – Dec 01   10 months   Inphomation/Worldcom

This was the second part of the receivership of Inphomation (see 11).  Over 18 months of undocumented call data had to be extracted from the database and analysed.  As before, the incorrectly retained income had to be identified and proven to the courts.  Several other sources of data were discovered and used to verify the data in the database.  This then had to be compared to the data held by Worldcom and the discrepancies identified.  The data was then used to support the lawyers' arguments in the courts and in negotiation with Worldcom.  I also analysed the data and provided lines of argument for the lawyers to use.

 

 

12 (6)   Jan 01 – Feb 01   2 months   Maracis/NHS

Another client who returned to me when they had problems asked that I carry out a complex ETL for them at short notice.  I also analysed why several of their systems were very slow and provided huge performance improvements for them.

 

 

11   Feb 00 – Dec 00   11 months   Inphomation/ATT

Inphomation ran the Psychic Hotline from Baltimore in the USA and went into receivership (they should have seen it coming).  Much of the reason for this was that ATT and Worldcom (see 13) illegally withheld call revenue from Inphomation.  I was brought in by the receivers (Maryland First Financial) to recover their database and analyse the data to support the receivership.  I started with a silent Unix box, no documentation and no staff to talk to.  I first secured a copy of the database (over 1 billion rows) and all the relevant files on the machine.  I then analysed all the data in the database to build an ERD and an understanding of the data, and summarised it all to identify the missing revenue.  I transferred the database to an NT server and loaded the ATT data giving their version of events.  There followed a lengthy process of matching data to identify the discrepancies, and then a period of reporting on the data to the lawyers and accountants to refute assertions made by Andersens (the ATT consultants) and maintain our claim.  In the end I never had to testify, as ATT settled out of court partly due to the quality of the data analysis I produced.
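The discrepancy matching can be pictured as a full outer join between the two sides' call records (a minimal sketch; the table and column names are hypothetical):

    -- Calls recovered from the Inphomation database versus calls ATT
    -- acknowledged: anything unmatched, or matched with a different
    -- revenue figure, is a discrepancy to put before the lawyers.
    SELECT NVL(i.call_id, a.call_id) AS call_id,
           i.revenue                 AS inphomation_revenue,
           a.revenue                 AS att_revenue
      FROM inphomation_calls i
      FULL OUTER JOIN att_calls a
        ON a.call_id = i.call_id
     WHERE i.call_id IS NULL
        OR a.call_id IS NULL
        OR i.revenue <> a.revenue;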

 

 

10   Apr 99 – Feb 00   11 months   Allders

Coming up to Y2K, Allders wanted expert advice on their Oracle databases, with which they were inexperienced, and help for the Y2K event itself.  I advised on how they were running their databases and provided a monitoring service that I eventually automated.  I made a number of design decisions for them in two large developments.  Having identified that they were vulnerable to database failure with poor recovery, I ran a series of 13 seminars for their DBAs in which we crashed databases in various scenarios and recovered them.  Most of the recoveries were very painful, which converted the DBAs to proper backup policies so that none of the exotic recoveries would be required!

 

9   Sep 98 – Mar 99   7 months   Woolwich

The Woolwich Bank needed an Oracle DBA to transfer technology and set up their database to run 24/7.  The database had been set up very poorly, so I re-organised it for 24-hour running.  I produced diagnostic scripts to enable close monitoring and problem prevention, and fault-tree documents so that local support could quickly identify Oracle problems.  I led the team through many disaster recovery exercises.  I provided extensive reporting to management defining how far away they were from true 24/7 working, and planned out all the steps to achieve it.  I carried out tuning and tuning analysis of the main 3rd-party application.

A large increase in customers was planned, so I measured the workload and created an SLA and stress tests.  I then used TPC models to predict hardware requirements for the expected load.

 

8   Oct 97 – Aug 98   11 months   Topnet

This period was used to take the Javelin product out of the original Oracle development into PL/SQL and the latest version of Forms.  It also marked the start of the move from VMS-based support to platform independence.  At the same time the design of the AI to support marketing activities was begun, bringing together many of the existing functions.  Many of the implicit warehouse functions were also formalised.

 

 

7   Dec 96 – Sep 97   10 months   Digicon

At the oil explorer Digicon, monitored their databases and carried out much database tuning.  Developed one of their job systems using Developer/2000 and PL/SQL in a Unix environment, using state modelling to design the system.


6   Jul 96 – Nov 96   5 months   Maracis

Carry out development on data loading for their mental health database.  Bring DBA skills to the company.  Do some crucial database tuning and data loading in a Unix environment.

 

 

 

5   Jul 96 – Nov 96   5 months   British Army

Develop a DBA course for the British Army at Rheindahlen.  Deliver the first version and hand it over to the training company.

 

 

4   Apr 88 – Jun 96   100 months   Eurotunnel

In April 1988 he went to work for the Eurotunnel Project implementation division as the DBA.  As the only person there with VAX and ORACLE knowledge, he set up ORACLE on the machine and set up the VAX development and running environments.  He tuned VMS to run ORACLE optimally and set up the databases on the system.  He wrote the programming standards and automated all the DBA functions.  He designed the help system and the security system.  As more staff joined he led the first major developments to successful delivery in August, writing a lot of the forms and PL/SQL himself as well as carrying out design and team-leading duties.  He sat on the technical review board and advised on technical matters such as strategy for new CPU purchases, interfaces to All-in-1, code management, quality, the VMS interface, ORACLE and VMS upgrades, sizing and performance.

 

In December 1988 he switched to acting as a trouble-shooter for problems with Oracle.  He still sat on the TRB, load-tested and developed with SQLTEXT prior to its introduction, monitored VMS performance and supplied emergency backup for operational problems.

 

In May 1989 he switched to defining and implementing the quality function for the division.  It was a little late in the day, but he started the process of turning the department into one that produced maintainable products to a rising set of standards.  He then implemented quality control on a number of sub-systems.  When the permanent quality staff were hired he handed over to them and assisted them in setting up a proper quality assurance operation (which they were much better able to do!).  He then continued in an advisory capacity, reviewing the software procurement proposals, advising on expansion onto a Unix-based database server on Sequent hardware, and assisting with the introduction of V6 of Oracle and the new Oracle products.

 

In November 1990 he designed a text retrieval system to support various activities in Eurotunnel.  This system linked with an image system to get text from scanned documents, and integrated that information with other existing systems.  He also designed in features such as text enhancement and dual-language capability.  From February through to August he built the system with one other person, and saw it through initial data load into normal running by October 1991.

 

During all this time he acted as supporter and advisor to other staff on Oracle and VMS, and also acted as standby DBA.  He continued to do this, and to work on a number of large internal systems, until January 1992, when he became a founder member of the integration team.  His task was to build the tools to support integration and then carry out that testing.  He designed the system and then led a small team in building it; it is today one of the core ET development tools.

 

As the development phase finished and the move to a production environment started, he moved back to the QC role full time.  His major task was to analyse all the code to improve resource usage.  Spectacular success came easily initially; then a long period of analysis resulted in many changes to the data structures and code to improve both client and server performance.  He read nearly every line of code in ET and reported bugs, together with fixes, numbering into the high hundreds.  He was also able to point out many ways to save space as well as to speed things up.

 

Finally, he provided all the technical input on the Oracle and VMS side to carry out a full-scale disaster recovery exercise, switching countries to new 'LIVE' machines.

 

 

3   Mar 87 – Feb 88   12 months   Maclarens

He developed their direct marketing system and improved the design in many places by bringing in ideas from real-time processing.  He acted as DBA throughout for a system that was under-powered, carrying out a lot of tuning and optimisation.  This was done in a VMS environment using SQL, DCL and Pro*Cobol.

 

 

2   May 86 – Feb 87   10 months   Stock Exchange

A member of the Billing team for ‘Big Bang’, developing the SEAQ system: writing Pro*Pascal programs and SQL.  The start of life as a DBA.

 

1   Until Apr 86   Real-time development and research

He designed and built multi-threaded, multi-processor real-time systems for radar and telecommunications companies.  The last task was to design the control system for BT's main exchange switches: a multi-processor design with two synchronised systems.  After the design he led the implementation team, building much of the system infrastructure himself and running a team of 10 to implement the complete system in two and a half years.  This was carried out in a Unix environment using Pascal and C on an Intel chipset.

Before developing real-time solutions he carried out research into manipulating large data sets using parallel processors.  Some of his algorithms are still in use in the NAG library.