Tuesday, December 31, 2024

2020-2024 Recap

What Worked…

The last four years have been a unique time in my career for several reasons. Due to the worldwide COVID-19 pandemic, many universities and businesses, including UCLA, moved employees to remote positions, so I found myself working from my home office full-time. UCLA switched to completely remote teaching and learning, relying heavily on Zoom, which had formerly been used at the University primarily as a business application rather than a mission-critical teaching and learning tool. As a direct result of my work with Slack, my responsibilities were expanded to include both the campus enterprise Zoom service and our newly implemented enterprise Adobe Creative Cloud service, with Adobe CC user license provisioning managed through the Kivuto product. In addition, the UCLA CIO initiated a two-year department transformation process in 2022, requiring all department employees to apply for positions that were posted in a series of waves concluding in November 2024. As a result, a number of experienced people left the department, and new hires often had no UCLA experience, and frequently no academic experience, leading to confusion over current practices, policies, and implementations of products and enterprise services. I also had four directors in five years, which made continuity, growth, and appropriate recognition difficult to achieve.

As a solutions architect with no supporting staff, this was a lot to take on, but the 2021 retirement of a director who was not being replaced necessitated transferring his projects to me; as the CTO said to me at the time, “no good deed goes unpunished.” I immediately began developing technical documentation for the services and assessing gaps and risks, with the goal of providing stable, secure, and reliable services. My greatest fear was a system-wide failure in a service required for teaching and learning.

Each of the services I owned, managed, and supported - Slack, Zoom, and Adobe Creative Cloud - had unique characteristics and a different support model. While I would have liked to consolidate at least Zoom and Slack under a single collaborative services model, the CTO was not supportive of my proposals and was not open to increasing my authority or assigning resources to assist me. Due to the 24x7x365 criticality of these services, the continuous questions about Slack, the often urgent requests for Zoom licenses, Slack workspaces, and workspace migrations, and the June 30 fiscal year end and quarter-based academic calendar, I found that typical vacations around the holidays were not possible; as a result I accrued over 400 hours of unused vacation time and a significant amount of stress.

Enterprise Slack Grid

From its launch in November 2020 until 2022 I was the primary support for Slack. The IT Services helpdesk had no trained Slack resources and no plan to review or respond to posts in Slack, since they preferred to work from ServiceNow tickets created via email. In keeping with Slack’s recommendation, our rapidly growing community of Slack workspace owners and administrators was encouraged to ask questions in Slack channels so that the channels would, over time, become a valuable self-help resource, allowing newer owners and admins to resolve issues without having to contact anyone and encouraging the growing community to help each other. I supported the workspace owners and admins so that they in turn could better support their workspace members: a distributed support model that allowed for rapid growth without requiring the hiring of additional support resources. I personally created over 200 Slack workspaces and migrated over 200 workspaces, working closely with a wide range of instructors, researchers, and department administrators. All migrations were scheduled after hours so that there would be no workspace outages during work hours, and as the sole workspace migration coordinator I worked over 200 hours of uncompensated after-hours time to ensure that workspace users would not experience service disruptions.

In 2022 the CTO assigned Slack administrative rights to a resource in the IT Services helpdesk, and from that time on that individual began creating Slack workspaces and responding to some end-user requests and questions. The helpdesk manager had no interest in handling Slack migrations, though, so I continued to coordinate all migrations through October 2024.

Campus Zoom

Unlike Slack, end user support for Zoom was provided by the IT Services helpdesk. There was also a group that focused specifically on teaching and learning using Zoom. Prior to the pandemic Zoom had been managed by a unit outside of IT Services, but when it was decided that teaching and learning would become remote the service was transitioned to IT Services. Zoom Pro licensing was implemented through the campus SSO service for all employees and students, while add-on licenses such as Zoom Webinar licenses were initially managed through a unit that handled software license purchases for departments; all Zoom licensing was abruptly turned over to me when it exhausted that unit's resources. The CISO developed security guidelines for Zoom usage and established policies regarding Zoom features that were accepted and published by the vice chancellor of administrative services. With the retirement of the director in IT Services who owned the Zoom service, its ownership was transitioned to me, making me responsible for all administration. Not being a director myself and having no test lab restricted my ability to safely improve, rework, simplify, or enhance the service, so I focused on practical process improvements, streamlining license fulfillment, and developing fiscal-year and quarterly processes to manage licensing and accounts. Requests for new features were researched and floated to my director or CTO for direction. My attempts to simplify the Zoom environment and reduce the number of sub-accounts were initially supported by my director but ultimately stymied by the CISO. I found myself responsible for a complex, mission-critical environment used by 60,000 people for thousands of meetings every day, but for which I had no authority. Because of its criticality for teaching and learning I took a cautious approach to managing the service, focusing mainly on risk reduction and necessary responses to vendor-driven product changes. This meant that not all feature requests from individuals were accommodated, but as a result of this approach there was never an outage caused by my actions or inactions, nor any disruption to teaching.

In 2022 a Zoom Phone project was initiated that was completed in December 2024, replacing thousands of campus phones with Zoom Phone. As Zoom service owner and administrator I worked closely with our telecommunications unit on all aspects of the project and adjusted security, roles, and licensing processes as needed to support the effort and ensure its success. During this time campus departments purchased over 150 Zoom Rooms in support of hybrid teaching and learning, for which I developed the Zoom location hierarchy and a licensing model for Zoom department administrators. I also initiated and chaired a Zoom Events technical interest group that included members from several institutions, with the goal of providing feedback to Zoom on features needed in higher education and developing guidance applicable to higher education.

Adobe Creative Cloud

The Adobe Creative Cloud service was implemented just as its service owner retired, and it was turned over to me as he walked out the door. The Kivuto service was also implemented at the last minute to manage Adobe licenses and avoid the risk of being charged by Adobe for overages. Prior to his retirement the director had been meeting with the desktop support and helpdesk managers in IT Services to transition Adobe, but once he left the building those managers refused responsibility. Since he had assigned me administrative rights, I became the service manager by default. I handled requests from new groups and departments for access and managed all Adobe CC package requests from departments and labs. I continued managing these functions through November 2024 and implemented standard processes for Shared Device Licensing (SDL) packages in support of campus labs.

And What Didn’t Work

As the CIO’s transformation progressed I considered and applied for several positions in the new organization, hoping that Enterprise Architecture would be reinitiated, since the coming Olympic Games should generate several significant IT projects. However, the CIO informed me that the funding for the EA position I applied for had been redirected to fund an AI position. I then applied for a number of solution architect positions, IAM positions, and positions related to the collaborative services I had been managing; in total I applied for over a dozen positions. I participated in a number of interviews and panel interviews in pursuit of these positions, while at the same time working closely through 2024 with the newly hired director of collaborative services and business products, advising him as he became accustomed to UCLA and his responsibilities. I continued to believe I had a future at UCLA.

The Last Word

As 2024 comes to a close, the director of collaborative services and business products has completed hiring his managers, product owners, developers, and administrators, and all of my responsibilities are now being handled by new hires. While I am disappointed in the lack of appreciation for my sacrifices and recognition of my accomplishments, keeping all of these mission-critical enterprise services running smoothly under these circumstances for years has been extremely challenging and stressful, and it is a relief to have my life back. Now I just need to figure out what I will do with it.

Sunday, August 29, 2021

Leveraging Slack Workflows to Manage Support Requests

The Enterprise Slack Service at UCLA

In the summer of 2020 UCLA IT Services leadership began moving forward on an enterprise implementation of the Slack Grid product for all of UCLA. An assessment by Slack showed that over 2000 free and paid Slack workspaces had been created by people with UCLA email accounts, so implementing an enterprise service would not only facilitate communications but could also provide a path for free and paid workspaces to migrate into the Grid and gain the benefits of a paid workspace without bearing the cost.

The project was headed by our CTO and I was assigned the role of solutions architect, primarily providing knowledge of UCLA's environment to the Slack and Deloitte members of the implementation team. By September 2020 we were creating and migrating workspaces in the UCLA Slack Grid in support of the campus community of over 60,000. The project was implemented quickly, and anticipating rapid growth we developed and implemented a set of workflows so that support requests could be entered in Slack, worked in Slack, and completed in Slack. This not only improves efficiency by allowing users to remain in Slack without switching to the telephone or email to make a request, but also leverages useful features of Slack such as channel history and integration with apps, while providing certain information about the identity of the individual making the request and facilitating any follow-up DMs (Slack direct messages) that may be required to get additional information from the requestor.

Most of the request workflows were created in September 2020 and we have improved them and added a few more as the product has been adopted by the campus. Within the first year we created over 150 workspaces, migrated nearly 100, and now have almost 5000 active Slack members. All of this growth has been accomplished without the need to add support staff, thanks in part to the use of the Slack workflows.

I was asked by Slack to participate in a customer panel, scheduled for September 1, 2021, to discuss UCLA's use of workflows. This blog post provides more detailed information for those who may have an interest in adapting our model for their own institution.

How Workflows are Used Within UCLA Slack

Workflows are a feature of Slack that allows forms-based entry of information and distribution of that information to channels, individuals, or applications. Workflows are attached to channels and can appear in the channel shortcut menu or be automatically initiated when someone first joins the channel. Workflows can also be integrated with Google Sheets so that logs can be kept outside of Slack itself, and access to those logs can be granted to people who may not need to be in the Slack channels themselves.

We chose to put the workflows in a small number of channels so that they would be easy for people to find:

Workflows in the UCLA Community workspace #slack-requests channel are for requests from the community to the UCLA community admins or Grid org admins. This is where the majority of workflows can be found. Some of the workflows include:

  • Request workspace creation
  • Request workspace migration
  • Request workspace deletion
  • Request a public channel be set to private
  • Request a channel be archived or deleted
  • Request a channel be unarchived
  • Request a channel be moved
  • Request a workspace be added or removed from a channel
  • Request a channel be shared with an external Slack workspace
  • Request application approval
  • Suggestion Box
  • Report Slack misuse

On joining the #slack-requests channel, a workflow is automatically initiated that explains the available workflows and how to launch them from the channel's shortcut menu.

Workflow when joining slack-requests channel

Workflows in the private slack-admins channel are for requests from workspace administrators or owners to Grid org admins. Examples of some workflows in this channel are:
  • Request workspace owner transfer
  • Grant or Revoke Channel Administrator role

Workflows in the #slack-help and #slack-feedback channels, which are available in all workspaces, allow suggestion-box-style entry of feature requests for review, as well as reporting of Slack misuse to administrators for investigation.

Standard Workflow Structure

All of the workflows follow a common structure so that if someone uses one workflow they will be comfortable with any other workflow. Following a standard approach also shortens the development time required to create new workflows. 

The standard workflow structure is:

  • Provide a brief description of the purpose of the workflow
  • Present an entry form to collect the particulars of the request
  • Route the form information to a private channel where it can be worked
  • Log the form information to a Google Sheets document for external capture and visibility for non-Slack administrators (not all workflows require this)
  • Produce a receipt of the form information for the requestor
  • Thank the requestor for submitting the request
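
To make the structure concrete, below is a minimal sketch of those same steps expressed in Python against the Slack Web API. The production workflows were built with Slack's no-code Workflow Builder, so this is only an illustration; the token, channel ID, and log file are placeholders, and a local file stands in for the Google Sheets log.

    # Illustration only: the production workflows used Slack's no-code Workflow
    # Builder. This sketch shows the same steps using the Slack Web API.
    # The bot token, channel ID, and log file are placeholders.
    from slack_sdk import WebClient

    client = WebClient(token="xoxb-placeholder-token")
    WORK_CHANNEL = "C0123456789"  # private channel where requests are worked

    def handle_request(requestor_id: str, form: dict) -> None:
        summary = "\n".join(f"*{k}:* {v}" for k, v in form.items())

        # Route the form information to the private working channel.
        client.chat_postMessage(
            channel=WORK_CHANNEL,
            text=f"New request from <@{requestor_id}>:\n{summary}")

        # Log the form information externally (Google Sheets in production;
        # a local file stands in for it here).
        with open("request_log.tsv", "a") as log:
            log.write("\t".join([requestor_id, *map(str, form.values())]) + "\n")

        # Produce a receipt for the requestor and thank them, via DM.
        dm = client.conversations_open(users=[requestor_id])
        client.chat_postMessage(
            channel=dm["channel"]["id"],
            text=f"Thank you! Your request has been submitted:\n{summary}")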

Workflow Demonstration - Suggestion Box workflow

The suggestion box workflow can be initiated from either the #slack-feedback or the #slack-requests channels.  Here is what running the workflow looks like:

Google Sheets Integration

Integrating a workflow with Google Sheets allows the information captured in the form to be shared easily through Google without requiring all interested parties to be members of the private Slack work channels.

Changes made to the Google Sheet after the integration is set up can break the integration, however, so it is important to restrict edit access and to include an About tab in the Sheet that explains what can and cannot be done safely.

Google Sheets About tab for the Slack Suggestion Box workflow


Workflow Tips


Here are some suggestions based on my experience developing these workflows.

  • Have at least one collaborator on a workflow in case a change is needed and you are not available
  • When requesting a channel name on a form, ask for it twice: once with a pull-down and once as free-form text. This not only verifies the channel name, but if the channel is private the admin may not be able to see the name from the pull-down option.
  • There is a maximum of 10 questions on a workflow form. Use them carefully.
  • The length of a workflow name is limited by the menu it is displayed in, so use consistent abbreviations.
  • If a workflow form contains optional questions then, if possible, make the final question in the form mandatory. This forces the user to scroll to the bottom of the form, seeing all of the questions, before submitting it.
  • The shortcut menu may not be obvious, so use a custom blue lightning bolt icon if possible when referring to it.
  • The Google Sheets integration does not have a way to automatically timestamp entries, so if the date is needed you will need to ask the user to enter it in the form.
  • Always capture information about who is entering the form so that you can easily follow up.
  • Do not share forms back to the channel. There is no reason for anyone in the channel to see a request.
  • Be aware that the order of workflows in the channel shortcuts menu is based on individual user usage, not alphabetical order.
  • If you need the same workflow to be present in multiple channels and write to a Google Sheets log, then set up a different sheet tab for each workflow to write to.
  • Use emojis and set a casual and friendly tone. If people do not feel comfortable using the workflow they will resort to email or the telephone.


Friday, June 28, 2019

Sunsetting my Work at Notre Dame


I worked in the Office of Information Technologies at the University of Notre Dame from 1999 to December 2004. As an Information Engineer, one of the primary systems I was responsible for designing, implementing, and supporting was the Enterprise Directory Service we named "EDS". The EDS was essentially an LDAP directory with a schema intentionally designed to support multiple applications for authentication, authorization, and attribute release, as well as providing enhanced user lookup for email clients and directory search web pages. Perl scripts were used to transform and import data from a variety of sources into the EDS, and other scripts were used to provide group management and data exports for downstream services.
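
The original import scripts were Perl; purely as an illustration of the transform-and-load shape of that work, here is a short Python sketch using the ldap3 library. The hostname, base DN, bind credentials, and attribute mapping are assumptions for the example, not the actual EDS schema.

    # Illustration only: the real import scripts were Perl. This sketch shows a
    # generic transform-and-load pass; DN layout, attribute names, and the shape
    # of the source records are assumptions, not the actual EDS design.
    from ldap3 import Server, Connection

    def to_entry(rec: dict) -> tuple:
        """Transform a source record into a DN and a set of LDAP attributes."""
        dn = f"uid={rec['netid']},ou=People,o=Example University,c=US"
        attrs = {
            "objectClass": ["inetOrgPerson", "eduPerson"],
            "uid": rec["netid"],
            "cn": f"{rec['first']} {rec['last']}",
            "sn": rec["last"],
            "mail": rec["email"],
            "eduPersonAffiliation": rec["affiliations"],
        }
        return dn, attrs

    def load(records: list) -> None:
        conn = Connection(Server("ldap.example.edu"),
                          user="cn=loader,o=Example University,c=US",
                          password="***", auto_bind=True)
        for rec in records:
            dn, attrs = to_entry(rec)
            conn.add(dn, attributes=attrs)  # real scripts also handled modifies and deletes
        conn.unbind()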

I've learned that while much of my EDS work has remained available and functioning since implementation almost 20 years ago, the system in its entirety will be replaced by a vendor product this fall. This blog post describes some of that work. For the moment it is still accessible at https://eds.nd.edu.

The EDS home page from 2004 to 2019

In addition to leveraging my own background in data modeling, my LDAP schema design was influenced by the "LDAP Recipe" of Michael Gettes, white papers on identity written by Bob Morgan, the early eduPerson object class work being done by the MACE-Dir working group (which I joined in 2000), "Understanding and Deploying LDAP Directory Services" by T. Howes, M. Smith, and G. Good, and assorted other texts. I was fortunate to have a good working partner in Jeremy McCarty and together we were able to stand up a robust, capable, and reliable service in early 2001 after several months of study, analysis, experimentation, and testing.

Initially the schema attributes that were populated were designed to provide a replacement for Notre Dame email address lookup for the campus and peer institutions. Indexing decisions and ACLs (Access Control Lists) were designed carefully to provide just the right information to just the right people or applications, in just the right way. It is fair to say that my design pushed the ACL flexibility of the iPlanet Directory Server to the edge. It would not have been possible with any other product. Later, in 2002, attributes were added to support fine-grained authorization using groups, and then in 2003 enhancements were made to support integrations with the campus Microsoft Active Directory Service and SendMail. The final update to the schema documentation was published August 6, 2004, though subsequent schema modifications were made after I left in December of 2004.

From the very beginning though, during early design in 2000, it was necessary to be able to query the EDS, show the data, and validate not only that the data was populated correctly for entries, but that the ACLs and indices were properly configured to search and release the correct entries and attributes. To that end I spent time creating a web site at eds.nd.edu to house first a search application and later online documentation, self-service privacy controls, and email preferences.

At the time OIT had a team that used Dreamweaver to generate HTML sites, and they wanted all OIT departments to utilize their templates so that there would be a common look and feel across our services. I embraced that idea and developed templates that included embedded functions that the Perl CGI program nd_ldap_search.pl would execute when rendering the pages. In this way I could readily leverage the common templates while still providing the customizable interface I wanted, without having to constantly rewrite the Perl CGI code that generated the pages. The Perl code could focus on the functions and the Dreamweaver template could handle the static web page content. All of this required a lot of HTML, JavaScript, and Perl work, and making sure that each layer was doing its part.
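
As a rough sketch of that division of labor (the real pages were Dreamweaver HTML templates rendered by the Perl CGI; the token syntax and function names below are invented), the idea was that the template carried the static layout while the program swapped tokens for the output of functions:

    # Illustration only: the originals were Dreamweaver HTML templates rendered
    # by the Perl CGI nd_ldap_search.pl. The token syntax and function names
    # here are made up to show the separation of layout from logic.
    TEMPLATE = """<html><body>
    <h1>EDS Search</h1>
    {{search_form}}
    {{results_table}}
    </body></html>"""

    def search_form(ctx: dict) -> str:
        return '<form action="/search"><input name="name"><input type="submit"></form>'

    def results_table(ctx: dict) -> str:
        rows = "".join(f"<tr><td>{e['cn']}</td><td>{e['mail']}</td></tr>"
                       for e in ctx.get("entries", []))
        return f"<table>{rows}</table>"

    FUNCTIONS = {"search_form": search_form, "results_table": results_table}

    def render(template: str, ctx: dict) -> str:
        out = template
        for name, fn in FUNCTIONS.items():
            out = out.replace("{{" + name + "}}", fn(ctx))  # swap token for function output
        return out
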
The EDS Search page

The best example of this collaborative architecture is the EDS advanced search page. Most searches performed at the time were based on simple common attributes such as name, affiliation, department, and the unique University NetID, which was an AFS ID. Notre Dame has always allowed people to have multiple email addresses, so searching by email address was also frequently needed. The simple search page allowed these searches to be done on the web. The attributes specified for searching in the HTML form would be passed to the nd_ldap_search.pl Perl script, which would construct the necessary LDAP search filter, execute the search against the EDS directory, and then construct a response page that could display one or many entries. Simple searches required only simple LDAP filters to be constructed, but the LDAP protocol allows for very complicated filters, and so an Advanced Search page was also provided that could take full advantage of the LDAP filter's potential.
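
For those curious about the filter-building step, here is a rough Python sketch of the kind of logic nd_ldap_search.pl implemented in Perl, using the ldap3 library. The host, base DN, and form-field-to-attribute mapping are assumptions for illustration, not the actual EDS configuration.

    # Illustration only: nd_ldap_search.pl was a Perl CGI. This sketch shows the
    # general shape of turning form fields into an LDAP filter and searching.
    # Hostname, base DN, and attribute names are assumptions, not the EDS schema.
    from ldap3 import ALL, Connection, Server

    FIELD_TO_ATTR = {"name": "cn", "netid": "uid", "email": "mail",
                     "department": "ou", "affiliation": "eduPersonAffiliation"}

    def build_filter(form: dict) -> str:
        # Real code must escape user-supplied values to avoid filter injection.
        clauses = [f"({FIELD_TO_ATTR[field]}={value})"
                   for field, value in form.items()
                   if field in FIELD_TO_ATTR and value]
        if not clauses:
            raise ValueError("no searchable fields supplied")
        return clauses[0] if len(clauses) == 1 else "(&" + "".join(clauses) + ")"

    def search(form: dict):
        conn = Connection(Server("ldap.example.edu", get_info=ALL), auto_bind=True)
        conn.search("o=Example University,c=US", build_filter(form),
                    attributes=["cn", "mail", "telephoneNumber"])
        return conn.entries

    # e.g. build_filter({"name": "Smith*", "affiliation": "staff"})
    #   -> "(&(cn=Smith*)(eduPersonAffiliation=staff))"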

With the Advanced Search page, extremely complicated search filters could be created to select entries, and a large number of attributes could be returned.

When only a single entry was returned the results page was constructed to resemble a page that would be returned by a standard iPlanet search when using their UI tools.  This was done to minimize user impact if we ever decided to utilize the vendor tools rather than our own (we never did).

Clicking on the "Display Complete List of Attributes" link at the bottom would cause the search to be rerun but return the results as a list.  There were a number of other formats to return the attributes as well, including as raw LDIF. Of course any entries or attributes that were restricted in any way would not be returned. Entries and attributes could be set to public, private, or restricted to the ND network. This was very important in order to ensure compliance with FERPA.

If multiple entries were returned by the search, they would be displayed in a list with minimal attributes shown in columns; clicking on an entry in the list would in turn display the single entry in detail as shown above. This same ability to display entries collectively was extended to support searches against multiple directories, so that it was possible to initiate a single search for a person across as many public LDAP directories as desired. This would have been a convenient way to find colleagues at other institutions.

Full documentation was included on the site to aid in accessing the EDS and utilizing its capabilities, including how to create department-specific templates and search pages, though this capability was, as far as I know, never used.

There was also a series of pages available only to authenticated users that allowed authenticated searches and let email preferences be specified for spam routing, whitelisting, blacklisting, and other capabilities supported by Sendmail using Sieve. There were also group management pages created so that authorization could be controlled by groups. Unfortunately, all of these pages are no longer accessible to me, and I do not know which of those functions continue to be used. A search through my presentations from those years turns up information about those capabilities as they existed at the time.

When I first interviewed with Notre Dame in 1999 I told the hiring manager, Gary Dobbins, that I wanted to build something that would last. While it is sad to learn that these pages and scripts will finally be retired later this year, I am proud that my work lasted in production nearly 20 years and I doubt that its replacement, no matter how expensive, will fare as well.

Wednesday, June 26, 2019

Recap: November 2017 - June 2019

Supervising Enterprise IT Architecture


From November 2017 to June 2019 I supervised the Enterprise IT Architecture team within UCLA IT Services.  I had participated in team meetings for a year before applying for the supervisor position and I was excited about accepting responsibility for developing a unit again, especially one that had the potential to assist in many IT areas within IT Services and across the campus. As supervisor of IT Architecture I also continued to organize and convene the IT Architecture Steering Committee meetings.

Diagram of the role of the IT Architecture team

To set up shop I established our website at https://spaces.ais.ucla.edu/display/itsvcea/Enterprise+IT+Architecture+Unit and began consolidating and organizing our online content and materials. I encouraged the team to participate with campus workgroups and architecture interest groups such as ITANA while increasing my own involvement in campus groups and programs.  I also attended the Gartner EA Summit in June.

As mentioned in my previous post, a wave of departures began in January 2018 and directly impacted the EA program. The CIO had been our primary champion and he was the first to leave. Within a span of months we lost several other directors, including my own. By July 2018 the organization was reeling and the departures began to affect decision making. At the same time the IT governance in place at UCLA was also being reconsidered. We had been hit by the perfect trifecta of uncertainty: brain drain, restructuring, and governance changes. We continued to operate, but some decisions were put on hold waiting for a CIO or permanent CISO to be hired.

By summer one of my team members had also chosen to leave, and I took on the additional role of UCLA representative for the UC ITAC (IT Architecture Committee). This systemwide group of IT architects focused on the development and review of artifacts to improve the quality of systemwide services.

Logo and catchphrase of the UCLA Green IT Taskforce - GrITT
As the fall of 2018 began I became further engaged in campus-wide activities. I volunteered to participate in the rebirth of the UCLA Green IT Taskforce, a formal task force of the UCLA Sustainability Committee, and was then asked to lead the effort. I was nominated for the campus Management Enrichment Program and completed the UC Professional Skills for Supervisors certification program.  To further the cause of Green IT I co-sponsored a campus Professional Development Project (PDP) team with the goal of polling and interviewing the campus IT leaders regarding Green IT issues. In October I ran an EA birds-of-a-feather session at the Internet2 TechEx conference and in December I presented on EA metrics on the monthly ITANA conference call. I also ran a second birds-of-a-feather EA session at the March Internet2 Global Summit.

Through the period from June 2018 to June 2019 I focused the Architecture team on developing an application inventory process that could be used to support the selection of applications to be migrated to AWS over the next year. We also leveraged the inventory for use with the campus Business Continuity program.

Beyond UCLA, the Enterprise Architecture presentations and activities I led during this period included the Internet2 birds-of-a-feather sessions and the ITANA presentation mentioned above.

One of the campus activities I decided to participate in to increase my awareness of campus departments was a series of open houses sponsored by various administrative departments held in the spring of 2019. One of these was held in the Campus Emergency Operations Center, and afterwards I decided to pursue a role in the Planning Section for the EOC. This requires FEMA training as well as ongoing participation in campus exercises and EOC activations, but it also offers a unique look into various departments of the campus.

As the end of the 2018-2019 fiscal year approaches, many of the activities I have been engaged in these last nine months are drawing to a close. In the newly announced organizational structure my role will be focused on AWS migration and I will no longer be supervising EA. The UC ITLC has decided to shut down ITAC, and the June Architecture Steering Committee meeting was the last. The PDP team completed its work and I have completed the MEP program. Going forward I will continue to chair the Green IT Taskforce and be engaged in Sustainability activities and CEOG activities. We should have a new CISO and a new CIO later this calendar year, and I will have a new supervisor and be reporting to a new director.

It has been an exciting time and more excitement awaits.

Wednesday, October 3, 2018

Recap: July 2016 - Nov 2017

UCPath Implementation


As the MFA project moved into production my role diminished and I was assigned to the UCPath project to serve as the Technical Lead for the IdM workstream.  UCPath is a UC-systemwide Payroll and Human Resources system intended to replace the 30+ year old mainframe PPS (Payroll Personnel System). UCPath is an acronym that stands for UC Payroll, Academic Personnel, Timekeeping, and Human Resources. UCLA ran an instance of PPS for UCLA, UCOP, ASUCLA, and UC Merced, and UCOP ran an instance of PPS for other campuses. The project was announced in October 2009 under UC President Yudof, with PeopleSoft selected in 2011. The Office of the President (UCOP) was the first to go live in December 2015. The next wave - termed the "pilot" - was targeted for August 2017 and was to include the campuses UCLA, UC Riverside, and UC Merced, and the organization ASUCLA.

Because UCOP utilized several of UCLA's identity-related systems, integrations were set up between UCPath and the UCLA systems even though UCLA was not yet on UCPath, so that as new employees were hired at UCOP they would be assigned their UCLA ID (UID) from the UCLA UID system and that ID would be communicated to the UCOP IdM system and UCPath. These integrations went live in December of 2015 but did not get much use because of the low volume of employees at UCOP. There were also concerns that some decisions made to meet the UCOP go-live deadline might require rethinking in order to scale to support UCLA.

While much work had been done to meet UCOP requirements, there remained much to do in order to fully support UCLA. Because of the number of separate test phases - system test, integration test, DR test, QA test - the time allowed for development and for building thorough test cases became increasingly compressed, and the time for correcting defects discovered in testing cycles was reduced. Eventually the decision was made to delay the pilot go-live date to December 2017. By July 2017 testing was proceeding well, but there remained significant concerns with the quality of the data and other aspects of the project. In early October UCLA announced that it was dropping out of the pilot group and would defer until 2018. ASUCLA, UC Merced, and UC Riverside decided to proceed on schedule, and because ASUCLA and UC Merced were heavily dependent on UCLA identity services, all of the IdM integration work still needed to be completed and implemented on schedule, so the decision for UCLA to defer actually added tasks to our workstream rather than reducing them. By that time, because the IAMUCLA team needed to take ownership of production support and I officially started in the role of IT Architecture supervisor on November 1, my role as IdM workstream lead was turned over to the IAMUCLA lead. I continued working as an SME with the project through the holidays, but by the time ASUCLA, UC Merced, and UC Riverside officially went live on January 2, 2018 I was no longer heavily involved.

There was a wave of departures, retirements, and reassignments that started in January 2018 and continued through the summer, including our CIO and several of the IT directors. One can only speculate that the 11th hour decision to defer UCPath for UCLA was a contributing factor in some cases.


UCLA went live on UCPath on September 23, 2018, along with UC Santa Barbara. It is not widely understood that, due in part to the efforts of my team from July 2016 to November 2017, many of the IdM processes and integrations had by that time been running in production for nearly a year.

And Is That All?


Though the UCPath project did take up the bulk of my time I did have the opportunity to participate in a few other major activities.

Student System Replacement RFP

In 2016 an effort was underway to consider replacing the Student System, and I began participating in March 2016. In April I began chairing the RFP Access & Roles Definitions sub-group, which developed that section of the formal RFP. The sub-group effort concluded in November, though the materials we generated were then used as input for similar sections in the Financial System Replacement RFP, which was spinning up around that time.

Financial System Replacement RFP

In late 2016 the Financial System Replacement effort began with the development of an RFP. The sections of the Student System Replacement RFP that my sub-group had developed were leveraged for the Financial System Replacement RFP. My role was primarily to review and edit those sections of the final RFP accordingly.

Enterprise IT Architecture Team

An Enterprise IT Architecture team had been created within IT Services by the Associate Vice Chancellor in 2012. An optimistic and visionary roadmap was developed at that time and training was done, but over the years the director resigned and the group lost first its focus and then, gradually, its influence. In 2016 Albert Wu, a senior director in IT Services, took ownership of the team and sought to assess and rebuild. Since I was his IdM Architect, he added me to the team. Leadership of the team was rotated quarterly and I served a stint as chair, but my primary focus was to observe, evaluate, and consider areas of potential improvement for both the team and IT Services. This lasted through the fall of 2017. At that time I applied for the permanent position as supervisor of the team and officially began in that role on November 1st. I was still engaged with UCPath, which required implementation of the IdM interfaces for UCOP, ASUCLA, and UC Merced, and I continued to be involved in that through the holidays.

The EITARCH team during 2017 was focused primarily on establishing the groundwork for future architecture work, with a special focus on Business Architecture and governance processes.

There were many opportunities during this year to establish relationships with individuals from a variety of UCLA departments, many of whom were business process owners. From that perspective it was a rewarding year.

Recap: Oct 2015 - July 2016

UCLA Campus Multi-Factor Authentication


In the Beginning

MFA High-level Decision Diagram - B. Bellina, Nov 2015
Most of my first year at UCLA was dedicated to the development and implementation of the campus Multi-Factor Authentication solution. Of course MFA was nothing new to Higher Ed, and presentations going back to 2011 were readily available (IAMOnline has several). UCLA had adopted hardware tokens at one point in the past, not to great acclaim, and MFA was not seriously considered until 2015, when it was proposed by Albert Wu following a security event. It was then adopted as an Information Security program under the interim director of IT Security, Michael Story. At the time I was hired in late 2015 the Duo product for MFA had been selected, and the campus Shibboleth Single Sign-On (SSO) implementation was being considered as the first major implementation for the campus to adopt MFA, with other services such as department VPNs to follow. I had practical MFA experience from my time at USC. In 2013, while managing IdM at USC, I presented on the topic at the EDUCAUSE Annual conference, and a year earlier Russell Beall on my team had developed and presented a functional prototype of a groups-driven Shibboleth SSO implementation with Duo for MFA. The prototype was shelved until 2015 when, after an Information Security leadership change, it was resuscitated; my former team at USC, now under Asbed Bedrossian's leadership, quickly productionized the prototype and implemented MFA with Shibboleth, first rolling it out to 570 IT and critical staff and then expanding to all 15,000 staff on December 1, 2015.

Shibboleth Login MFA flow for UCLA - B. Bellina, Nov 2015
Although some investigative work at UCLA was done in the final quarter of 2015, the official "Multi-Factor Authentication" project kick-off was January 27, 2016. The goal was to have a functional MFA solution developed by Memorial Day, running on the Shibboleth Identity Provider v3. Our currently implemented Shibboleth IdP v2 was going to be end-of-lifed in June 2016 along with SAML 1 support, so we had to plan the IdP v3 upgrade and migrate all SAML 1 SPs to SAML 2 as well as integrate MFA. After a QA period we would then upgrade the IdP and begin controlled, optional enrollment of MFA in May. Considering the complexity of the UCLA environments it was an aggressive schedule, but the fixed end-of-life of Shibboleth IdP v2 made this a firm deadline. We held discussions with the IdM teams at both USC and Penn State to discuss their MFA Duo implementations. At that point Penn State was closing in on 10,000 enrolled staff/faculty MFA users, with plans to enroll all 25,000 faculty and staff by May. While documenting our requirements we considered the work done elsewhere, all of which had been done for IdP v2: Duo's own Shibboleth IdP v2 plug-in, a Duo MFA on-demand IdP v2 solution proposed by Russ Beall here <https://wiki.shibboleth.net/confluence/display/SHIB2/Duo+2FA+On+Demand>, and the IdP v2 Shibboleth Multi-Context Broker. In the end we contracted with Unicon, and John Gasper worked with us to develop our own custom IdP v3 logic that leveraged the standard eduPersonEntitlement attribute, which we populated from Grouper groups. It was tight, but the compiled code was turned over by Memorial Day and even included the hooks needed to support the AuthnContextClassRef profile for MFA being proposed by InCommon.
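
The decision logic itself lived inside the IdP (custom Java/Spring work done with Unicon), but the shape of the entitlement-driven check is simple to express. Here is a minimal sketch in Python; the entitlement URN is invented, and the MFA context value shown is the later REFEDS profile rather than whatever identifier the InCommon proposal used at the time.

    # Illustration only: the real logic ran inside the Shibboleth IdP v3 as
    # custom Java/Spring code developed with Unicon. This sketch just shows the
    # shape of an entitlement-driven step-up decision; the URN is hypothetical.
    MFA_ENTITLEMENT = "urn:mace:example.edu:entitlement:mfa-required"  # from a Grouper group
    MFA_CONTEXT = "https://refeds.org/profile/mfa"  # later REFEDS MFA context class

    def requires_duo(entitlements: set, requested_contexts: set) -> bool:
        """Step up to Duo if the user carries the MFA entitlement or the service
        provider explicitly requested an MFA authentication context."""
        return MFA_ENTITLEMENT in entitlements or MFA_CONTEXT in requested_contexts

    # e.g. requires_duo({"urn:mace:example.edu:entitlement:mfa-required"}, set()) -> True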

B. Bellina, Feb 2016

Talking the Talk

Along the way, in addition to developing technical requirements, working with Unicon, and getting the test and QA environments set up, there needed to be continuous communication to campus groups to allay fears, avoid misunderstandings, develop partnerships, and request feedback. Often these communications were in the form of presentations, and because of my experience in that area I had several opportunities in the months between January and June both to contribute to presentations and to give them. It was for one of the early presentations that I applied the tagline "As Simple as 1... 2... 3", which in one form or another continued to be used in UCLA MFA literature for years afterward.

A partial list of presentations given in 2016 at UCLA includes:
  • Jan 12 - IAMUCLA Townhall
  • Feb 17 - UCLA Info Sec Ask the Experts "Multi-Factor Authentication"
  • Mar 8 - IAMUCLA Townhall
  • Mar 22 - Common Systems Group meeting - "Multi-Factor Authentication"
  • Apr 14 - DIIT Spring Staff Meeting "IT Services Information Security Program"
  • Apr 26 - Common Systems Group meeting - "Deploying Multi-Factor Authentication with UCLA Logon"
  • Apr 27 - BruinTech Tech-a-thon "Multi-Factor Authentication: The UCLA Campus Service"
  • May 3 - IAMUCLA Townhall
  • Jun 27 - DACSS Training half-day seminar "Multi-Factor Authentication"
  • Nov 17 - BruinTech Brown Bag "Multi-Factor Authentication"
There were also opportunities in 2016 and even 2017 to present to larger audiences at UC, InCommon, and Internet2 events.

Walking the Walk

The MFA production rollout began on time in June 2016 with 9 enrolled users on June 6 performing 40 MFA-enabled logons a day. By June 29 there were 129 enrolled users performing over 700 MFA-enabled logons per day. My time on the project ended in July but the rollout continued to the campus community and eventually enhancements were done to the enrollment user interface.
  • By the end of 2016 there were over 700 users enrolled.
  • By the end of June 2017 there were over 2,500 users enrolled.
  • On October 31, 2017, all non-medical faculty and staff were required to use MFA to access campus applications through SSO and the campus VPN, increasing enrollment to over 26,000.
  • By the end of 2017 there were over 31,000 users enrolled, performing over 50,000 MFA-enabled logons per day.
  • On April 17, 2018, all students were required to use MFA to access campus applications through SSO and the campus VPN, increasing enrollment to over 71,000 users performing over 100,000 MFA-enabled logons per day.
Because all incoming UCLA employees and students are now mandated to use MFA in order to access web applications, including the new payroll system, the numbers continue to rise. At this time UCLA is undoubtedly one of the largest Duo MFA implementations of any university in the United States.

This was a project I am proud to have been a part of and I remain grateful to Albert Wu for giving me the opportunity.

Tuesday, January 12, 2016

Presentation Jitters

Giving my first UCLA presentation this afternoon on SSO and MFA. Besides my lack of familiarity with the audience and venue, I am also a little nervous because it has been over two years since I delivered my last presentation. I have written presentations in the last couple of years, but for various reasons they were never delivered beyond my immediate management. It is hard to believe that it has been so long, though.

While I have delivered over 100 presentations in the last 15 years, I have found it never really gets old. There is always nervousness, although I tend to lose this quickly once I am speaking and thinking on my feet (sometimes in one order, sometimes the other). There is always the desire to tweak the slides one more time. And I have to fight my desire to add content on the fly and try to remember the "when in doubt, throw it out" motto. Presentations are best when they introduce concepts and pique curiosity rather than attempting to educate the audience. With a very limited amount of time and a slide deck it just isn't possible to really educate to any depth, and when diving too deep you always leave the audience gasping for air.

Because this is an internal presentation I will not be able to post it, but some of it may appear in later presentations at Internet2 or EDUCAUSE.