
217 Results Found

How FDA Trains its Investigators to Review CAPA and What You Should Do to Prepare
8/27/2014 10:00 AM - 11:00 AM
Online Event Fremont, California United States
Event Listing
Summary:

During an inspection, FDA personnel will take a great deal of time reviewing your company's CAPA system. What will they look for? This session will discuss all the documents used by FDA to train their inspectors to review your CAPA system, some of which you may not be familiar with.

Best Practices for Governance of Enterprise Reference Data Management
8/27/2014 10:00 AM - 11:15 AM
Online seminar Fremont, California United States
Event Listing
Summary:

Overview:

Experience shows that upwards of 80% of reference data may be used by multiple applications. Clearly, common reference data should be considered "enterprise data" and governed through the Data Governance group. Common reference data often crosses subject-area lines, which makes it difficult to identify which Data Steward is responsible for managing it. Moreover, reference data is often treated as an unwanted stepchild and left for individual applications to manage, even though many of the organization's metrics and regulatory reports now depend on common reference data.

Why should you attend: Reference Data Management is a valuable tool that lets the Data Governance team enable quality and consistency across the enterprise. Reference data often must be managed consistently across disparate operational applications to meet regulatory and governmental reporting requirements. Inconsistencies and inaccurate aggregations can lead to significant fines, poor publicity, and negative public reactions that produce short-term stock losses.
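To make the governance problem concrete, here is a minimal Python sketch of the kind of cross-application consistency check a Data Steward might run over a shared reference code set. The application names and table contents are invented for illustration, not taken from the seminar.

```python
# Hypothetical sketch: the same reference code set (country codes)
# maintained independently by two applications drifts out of sync.
crm_countries = {"US": "United States", "DE": "Germany", "FR": "France"}
billing_countries = {"US": "United States of America", "DE": "Germany",
                     "GB": "United Kingdom"}

def reference_data_diff(a, b):
    """Return codes missing from either side and codes whose
    descriptions disagree -- raw material for a Data Steward review."""
    missing_in_a = sorted(set(b) - set(a))
    missing_in_b = sorted(set(a) - set(b))
    conflicting = sorted(c for c in set(a) & set(b) if a[c] != b[c])
    return missing_in_a, missing_in_b, conflicting

missing_in_crm, missing_in_billing, conflicts = reference_data_diff(
    crm_countries, billing_countries)
print(missing_in_crm)      # codes billing has that CRM lacks
print(missing_in_billing)  # codes CRM has that billing lacks
print(conflicts)           # codes whose descriptions disagree
```

A report like this identifies exactly the inconsistencies the session warns can distort regulatory reporting and aggregations.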

Areas Covered in the Session:

Identifying what is "common reference data" & which Data Stewards/SMEs are the trustees when reference data crosses Lines of Business.

Dealing with regulatory bodies & leveraging governmental agencies as sources of reference data

Creating new ways for informal governance of reference data while leveraging "natural entry points" to take advantage of business events in the business

Who Will Benefit:

Chief Data Officer

Director - Data Architecture

Director-Data Governance

Enterprise Solution Architect

Data Architect

Business Analyst

Getting The Most From Agile/Scrum: Building a Great Product, Increasing Efficiency And Quality, Redu
8/21/2014 10:00 AM - 11:30 AM
Online seminar Fremont, California United States
Event Listing
Summary:

Overview: Agile methodologies promote a project-management process that encourages frequent inspection and adaptation, and a leadership philosophy using teamwork, self-organization and accountability. Implementation of Scrum across industry runs the gamut from organizations that use all of the practices to organizations that only use a few of them. This means that just declaring, "We use Scrum," or "We are Agile," does not guarantee any specific level of Agile usage or quality.

This is a significant concern in a critical-system environment (IT and safety-critical applications). Scrum can be used to develop critical software systems, but additional engineering and management practices need to be considered to ensure that robust products are developed and that risk and inefficiencies are minimized. In this presentation we will explain how Scrum can be used in any organization, along with the additions that can clarify customer needs, manage project risks and reduce inefficiencies while maintaining the benefits of Scrum.

Areas Covered in the Session:

Over-zealous use of the agile manifesto

Having a good enough backlog - quality and review

Do you have end-user involvement?

No release plan, only scoping for one iteration at a time

Embracing change - when is too much?

Ensuring that everything that needs to be done (e.g., design and test) can be done in an iteration

Managing Dependencies

Managing surprises, managing risk

Refactoring and code ownership isn't free

Self-organizing teams - can they self-organize?

Scrum masters don't like the things they have to do

Be careful with the "Pigs and Chickens" terms

What is design and where does it fit?

Who Will Benefit:

Senior Managers wanting to use Scrum and manage quality and risk

Project or program managers leading teams using Scrum

Scrum Masters / Coaches

Internal company Project Management Office (PMO) leaders and members

Internal process improvement coaches tasked with improving the organization's cost, schedule, and quality performance who want to implement Scrum

Holistic Operational Security: Bringing Application, Server and Network Security Together
8/27/2014 10:00 AM - 11:30 AM
Online seminar Fremont, California United States
Event Listing
Summary:

Overview: This presentation examines a modern Rails application, reviewing application security best practices, the specific Rails controls for the application, best practices in deployment, and how to integrate application controls, local host firewall controls, and network firewall controls into a self-monitoring, alerting and automated security system.

All techniques and tools reviewed are open source and not only freely available but strongly encouraged. A short list of the technologies that will be reviewed includes Rails, nginx, NAXSI, rack-attack, Brakeman, syslog, fail2ban, OSSEC and more.

Why should you attend: Do you have applications on the internet? Have you secured the application in addition to the server and network it runs on? Do all components work together to provide security for the application and your data? This presentation will examine a Ruby on Rails application with integrated security controls and show how to integrate them into a holistic operational security system that protects against and responds to threats to the system.
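As a rough illustration of the log-watching layer that tools like fail2ban provide, here is a minimal Python sketch. The log format, threshold, and IP addresses are assumptions for illustration only, not the webinar's implementation.

```python
# Sketch of the fail2ban idea: scan an auth log for repeated failed
# logins from one IP and report addresses a firewall layer should ban.
import re
from collections import Counter

# Assumed sshd-style log line; real deployments match their own formats.
FAIL_RE = re.compile(r"Failed password .* from (\d+\.\d+\.\d+\.\d+)")

def ips_to_ban(log_lines, threshold=3):
    """Return IPs with at least `threshold` failed login attempts."""
    failures = Counter()
    for line in log_lines:
        m = FAIL_RE.search(line)
        if m:
            failures[m.group(1)] += 1
    return sorted(ip for ip, n in failures.items() if n >= threshold)

log = [
    "sshd: Failed password for root from 203.0.113.9 port 52110",
    "sshd: Failed password for admin from 203.0.113.9 port 52111",
    "sshd: Accepted password for alice from 198.51.100.4 port 40000",
    "sshd: Failed password for root from 203.0.113.9 port 52112",
]
print(ips_to_ban(log))  # ['203.0.113.9']
```

The "holistic" part of the session is wiring this detection output into host and network firewall controls so the system responds automatically.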

Areas Covered in the Session:

Application security best practices

Server / network security best practices

Integrating server / network and application security into holistic security system

Who Will Benefit:

Application Designers

Application Programmers

Security Engineers

System Administrators

Minimal IT/Security management

Public Clouds: Myths and Realities
8/28/2014 10:00 AM - 11:30 AM
Online seminar Fremont, California United States
Event Listing
Summary:

Overview: To correctly understand the benefits and issues surrounding public clouds, we first need to go back to the definition of "cloud." Different organizations (like NIST), vendors, and consultants have produced definitions that are often too verbose or address only one key characteristic of the cloud. It is important to identify the key concepts that differentiate the cloud from prior ways to procure and deliver computing resources.

It is useful to examine the hype surrounding the concept, propagated by a number of public cloud providers. While this is often anecdotal, this discussion helps customers challenge salespeople who recite marketing points without substance. It is also quite educational to see how certain vendors have changed their story as the new model emerged.

The next thing potential adopters need to know is what the categories of issues that cause all the fears about cloud computing are. Some of these fears are exaggerated: for example, availability issues, while they create a lot of adverse publicity, are not in fact as serious as people often fear, and there are good reasons for that. On the other hand, other issues are actually often understated (e.g., the problem of data residency in applications that manipulate data associated with some form of national interest or in highly regulated industry sectors).

After considering both extremes of this ongoing debate, a customer needs to develop a balanced view - not only of the technology, but also of the sourcing and governance issues that can make a project fail. In support of this evaluation, it helps to understand the evolution of multi-tenant computing solutions from the start of timesharing 50 years ago to the current types of offerings. There are common principles, but the cloud does bring something genuinely new compared to the initial IBM offerings of the 1960s. It is also important to read or listen to case studies, of which there are now a good number. Some of them are public, while others are shared in conferences and consortia. Vendor-published stories should be considered suspect. Finally, it is also important to understand the full scope of services that can be procured in the cloud: it is not just CRM applications, Web sites, or disk space, but it includes many more types of communication and collaboration capabilities, and this provides an opportunity to "start small" and get familiar with the issues while starting to save some money and decrease cycle times.

Once all this is understood, an organization needs to proceed in a pragmatic manner. Key steps of this journey have been documented in particular by the Cloud Standards Customer Council, which has published them in three successive guides, including a 9-step "Practical Guide to Cloud Computing" and two guides related to Cloud Service Agreements.

Why should you attend: Since the Cloud Computing model started taking off around 2007, the vendors have been promising miraculous benefits, and the naysayers have been raising the specter of major disruptions to performance and security.

Without a neutral source of information and a balanced perspective on the advantages and risks of public clouds, CIOs and IT sourcing managers will easily make the wrong decisions. If you adopt a cloud solution that is not well-suited to your needs, under a Service Level Agreement that is biased toward the vendor, you could indeed be faced with disruptions about which you will have very little control. On the other hand, if you do not adopt any cloud solutions out of fear, your competitors may become more agile and your customers or users will ultimately find that you cannot deploy new capabilities fast enough. You will also continue to pay upfront for a fixed amount of software or hardware that you need to amortize over several years, instead of taking advantage of the cloud’s elasticity and its ability to replace capital investments with operating expenses.

The emergence of the cloud also changes the relationship between IT and the business. If IT cannot deliver new services fast enough, the business will start procuring those services on its own in the cloud. Ultimately, this can make IT irrelevant, and will lead the business to make poor choices in terms of redundant solutions, lack of integration, uneven security, etc. The business needs to understand what’s at stake, and IT needs to use clouds wisely in order to maintain its role as a partner to the business.

Areas Covered in the Session:

Introduction

Back to Basics: Defining "Cloud"

The Hype

The Fears

The Reality

Cloud Computing Use Cases

Cloud Computing, or "Cloud Whatever"

Pragmatics

Service Level Agreements

Conclusions

Who Will Benefit:

CIO

IT Manager (reports to CIO)

CFO

Sourcing Manager

Cloud Providers

Senior IT Consultants

Understanding Your Data Before Building Predictive Models
8/26/2014 10:00 AM - 11:15 AM
Online seminar Fremont, California United States
Event Listing
Summary:

Overview: Data-driven decision-making based on big data, data science, and predictive analytics has quickly become one of the most important areas of growth in organizations. Skills in these areas are specialized and often lacking in traditional education.

Effective predictive modeling does not require a PhD in mathematics, statistics, or a hard-science field. Many effective and even famous data miners and predictive modelers have BS or BA degrees in non-technical fields. However, it does require a qualitative understanding of what the predictive modeling process is about, what algorithms do, what their limitations are, how to change their behavior, and what kind of data is necessary for building predictive models.

Even individuals with experience in analytics understand that predictive modeling requires not only an understanding of the science, but also decisions throughout a modeling project that are not (indeed cannot be) governed fully by the science; there is "art" in the tradeoffs we as analysts make at every stage. These aren't guesses, but are rather governed by the principles of predictive modeling: sampling, data distributions and their effects on summary statistics and the modeling algorithms, and how to determine whether a model is good or not.
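The point about data distributions and summary statistics can be made concrete with a small Python sketch. The data, threshold, and skew heuristic below are illustrative assumptions, not material from the seminar.

```python
# Sketch of "understanding your data" before modeling: a skewed
# feature distorts the mean, so compare mean vs. median before
# trusting summary statistics or feeding the feature to a model.
import statistics

def distribution_report(values):
    mean = statistics.mean(values)
    median = statistics.median(values)
    stdev = statistics.stdev(values)
    # Heuristic: a mean/median gap above 0.2 standard deviations
    # signals skew or outliers worth inspecting before modeling.
    skewed = abs(mean - median) > 0.2 * stdev
    return {"mean": mean, "median": median, "stdev": stdev,
            "skewed": skewed}

incomes = [30, 32, 35, 31, 33, 34, 500]   # one extreme value
report = distribution_report(incomes)
print(report["median"])   # 33 -- the typical value
print(report["skewed"])   # True -- the mean is dragged upward
```

A check like this is exactly the sort of pre-modeling step CRISP-DM's data-understanding phase calls for.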

Areas Covered in the Session:

CRISP-DM - what are the main steps in the predictive modeling process

Key steps in defining modeling objectives

The most important principles in setting up data for modeling

Brief overview of key modeling algorithms

Matching model accuracy to business objectives

Who Will Benefit:

Data Scientists

Big Data Analysts

IT Professionals

Project Leaders

Business Analysts

Functional Analytic Practitioners

Anyone Overwhelmed with Data

6-Hour Virtual Seminar on Business Architecture Webinar By EITAGlobal
9/11/2014 8:00 AM - 11:00 AM
Online seminar Fremont, California United States
Event Listing
Summary:

Overview: This workshop provides participants with a toolset for developing a business architecture that goes well beyond the technology-oriented approaches typically employed by EA. Instead, participants are shown how to apply a 5-level framework for modeling any organization, from the top down to the user and enabling technology.

This multi-layered approach makes visible all of the key processing sub-systems and shows how critical business drivers, such as profitability, cost, and quality, can be linked to drive performance at all levels. This course expounds on the core concepts described in the book Rediscovering Value: Leading the 3-D Enterprise to Sustainable Success, co-authored by the presenter.

Why should you attend: Enterprise Architecture (EA) is an entire discipline within IT, devoted to defining the major components of organizations, for which technology is a supporting set of products and services. But EA experts are confused and disagree about what an enterprise architecture is, how to depict one, and what purpose it serves in guiding technology development.

There is no one model or approach to EA. The Zachman Framework has been referenced again and again in numerous models, but it is never clear what the relationship is between a given EA and Zachman's original work, nor is there any reason to consider the Zachman model correct or superior to anything else. EA is a discipline without any clear standards.

Areas Covered in the Session:

Distinguish between levels of business architecture

Apply a hierarchical structure to the depiction of business processing sub-systems and their linkages between business objectives and work objectives

Identify and practice with modeling tools proven to be most effective for each level of a business architecture

Distinguish between approaches to work system boundary setting that do and don’t support an effective business architecture model

Who Will Benefit:

Business architects

BPM practitioners

Enterprise Architects

Business process modelers

Measurement specialists

Speaker Profile:

Alan Ramias is a Partner of the Performance Design Lab (PDL). PDL is a consulting and training organization with decades of experience in applying BPM and performance improvement. The founder of PDL was the late Dr. Geary Rummler who co-authored the book Improving Performance: How to Manage the White Space on the Organization Chart, which helped trigger the process improvement/reengineering movement. Alan and his partners continue to evolve and expand the theory base and methodologies introduced in Improving Performance to include breakthrough approaches to management systems, measurement, strategy, and organization structure design and implementation. Alan has consulted with dozens of companies on performance management and measurement, helping to install effective, practical process management and measurement systems.

Business Glossary: First Step to Data Governance Success Webinar By EITAGlobal
9/17/2014 10:00 AM - 1:00 PM
Online seminar Fremont, California United States
Event Listing
Summary:

Overview: Organizations are often challenged in selecting the first enterprise-wide implementation that the Data Governance (DG) program should address. A Business Glossary is a great first initiative, as well as one that provides significant value to the enterprise.

A Business Glossary is the tool for exposing authoritative content from DG initiatives and is used to communicate understanding and clarity across the enterprise. A business glossary can connect workers across the enterprise to critical business information they can trust, helping to eliminate misunderstandings that cause lost time, lost opportunities and lost revenue. Creating great definitions and business term names will aid significantly in enabling search capabilities. The Business Glossary should be an early deliverable from your DG program and mature to include the logical and physical data constructs as a valuable component to drive DG maturity and value.
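A minimal sketch of what such a glossary might look like as a data structure, with search over term names and definitions. The field names, sample terms, and definitions are invented for illustration.

```python
# Hypothetical business glossary: each term carries a definition,
# a responsible steward, and an approval status.
glossary = {
    "customer": {
        "definition": "A party that has purchased at least one product.",
        "steward": "Sales Data Steward",
        "status": "approved",
    },
    "active customer": {
        "definition": "A customer with a purchase in the last 12 months.",
        "steward": "Sales Data Steward",
        "status": "draft",
    },
}

def search_terms(glossary, keyword):
    """Case-insensitive search across term names and definitions --
    the search capability good definitions are meant to enable."""
    k = keyword.lower()
    return sorted(
        term for term, entry in glossary.items()
        if k in term.lower() or k in entry["definition"].lower()
    )

print(search_terms(glossary, "purchase"))
# -> ['active customer', 'customer']
```

Even this toy structure shows why naming and definition standards matter: the search is only as good as the wording stewards agree on.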

This seminar will be helpful for data management and governance professionals who have been challenged with any of the following issues:

How to organize the business glossary program for quick wins as well as position for a maturing DG program

Business Intelligence, BPM and KPI ambiguity

Legal and compliance regulations are driving new projects that are not clearly defined (such as TART)

Enterprise or international projects like CDI/MDM that must address terminology and semantic differences across the enterprise

Why should you attend: Organizations are often challenged in selecting the first enterprise-wide implementation that the Data Governance (DG) program should address. A Business Glossary is a great first initiative, as well as one that provides significant value to the enterprise. A Business Glossary activity, when addressed as data sprints, can provide the DG program with effective and quick success, as well as great visibility for the program.

Good Governance processes are critical for many enterprises that are:

Unclear how to achieve quick results from Data Governance

Organized geographically, introducing global enterprise semantic differences

Challenged by the complexity of terminology spread across many glossaries

Dealing with mergers and acquisitions that have introduced semantic differences across the enterprise

Areas Covered in the Session:

Methods for establishing the Business Glossary, standards and best practices

How to leverage your existing Governance team and processes

How to create structured definition standards and name business terms

Techniques to get your Glossary populated and used

Who Will Benefit:

CDO and Director of Data Governance

Data Stewards

Data Governance Analysts and Data Governance team members

Data Architects

Speaker Profile:

Lowell is recognized as a thought leader in business metadata/glossaries, enterprise application integration, DW/BI applications, and Data Governance, with hands-on experience in over 80 business intelligence implementations. He has also been recognized by W. H. (Bill) Inmon and is a contributor to six of his books. Mr. Fryman is a co-author of the book “Business Metadata: Capturing Enterprise Knowledge”. Lowell is a Data Architect in the Healthcare Practice of Edgewater Technology Inc. He has achieved a number of certifications, including Certified Business Intelligence Professional (CBIP) from ICCP/TDWI, and has an MSIT degree.

Dynamic APIs and Dynamic Schemas: The Secrets of Building Inherently Flexible Software
9/23/2014 10:00 AM - 11:30 AM
Online seminar Fremont, California United States
Event Listing
Summary:

Overview: The central technical challenge for Agile Architecture is how to achieve functionality and performance without having to trade off flexibility. The context for these central patterns of Agile Architecture is the concept of architecting at a dynamic level of abstraction above the logical level of contracted APIs and data schemas.

At this dynamic level, there are the central patterns that are essential to resolving the fundamental compromise of distributed computing:

Dynamic Coupling. Tightly coupled interfaces require detailed knowledge of both sides of a distributed computing interaction, and any change on one side might break the other. Contracted interfaces introduce loose coupling, but at the expense of a static interface. With dynamic coupling, interface differences are resolved dynamically at run time.

Dynamic Schemas. Neither the WSDL files that specify Web Services, nor the URIs, HTTP verbs, and Internet Media Types that specify RESTful APIs adequately contract the message semantics for any interaction. Dynamic schemas abstract all semantic metadata in a consistent way, relying once again upon the integration engine to resolve these dynamic schemas for each interaction at run time.

Extreme Late Binding. SOA registries ended up doing little more than resolving endpoint references at run time, similar to the way DNS resolves domain names - in other words, they provided late binding. Such late binding adds some flexibility to an interaction, but typically at the expense of performance. Today, however, dynamic coupling and dynamic schemas enable any client to discover at run time all the metadata it requires to interact with any endpoint, without sacrificing performance - what we call extreme late binding.

Put these architectural principles together and you have an approach for building inherently flexible software, even in a complicated distributed computing environment.
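As a toy illustration of these three principles working together, the Python sketch below resolves an endpoint's schema from a registry at call time rather than compiling against a fixed contract. The registry contents, service names, and field mappings are invented for illustration, not the presenter's implementation.

```python
# Sketch of dynamic coupling + dynamic schemas + extreme late binding:
# the client discovers an endpoint's metadata at run time and remaps
# its payload to match, so a provider-side schema change needs no
# client rebuild.
REGISTRY = {
    "orders-v1": {
        "endpoint": "https://api.example.com/v1/orders",
        "fields": {"customer": "cust_id", "total": "amount"},
    },
    "orders-v2": {
        "endpoint": "https://api.example.com/v2/orders",
        "fields": {"customer": "customer_id", "total": "order_total"},
    },
}

def bind_and_build(service, payload):
    """Resolve the service's schema at call time (extreme late
    binding) and remap payload keys per its dynamic schema."""
    meta = REGISTRY[service]
    mapped = {meta["fields"][k]: v for k, v in payload.items()}
    return meta["endpoint"], mapped

# The same client call adapts to either schema version.
endpoint, body = bind_and_build("orders-v2", {"customer": 42, "total": 99.5})
print(endpoint)
print(body)  # keys renamed to match the v2 schema
```

Swapping "orders-v2" for "orders-v1" changes both the endpoint and the wire-level field names without touching the caller, which is the flexibility the session argues contracted interfaces give up.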

Why should you attend: The central challenge of distributed computing is how to get your various distributed bits to communicate with each other properly. Since those distributed components are typically heterogeneous, we must somehow come up with a common means of establishing interaction among components everybody can agree on. Yet, once we do that, we've necessarily compromised on flexibility, because changing how our components interact is a difficult, complex endeavor. This problem pervades the entire history of APIs, from remote procedure calls to Web Services to RESTful APIs and everything in between. We must somehow contract interfaces in order to abstract the underlying functionality, yet the very act of introducing such contracts is a compromise, since the interface itself now lacks flexibility.

Areas Covered in the Session:

Review of Web Services and RESTful APIs

Limitations of contracted interfaces

Challenges of document style services

Challenges of custom media types

Meta, Dynamic, and Logical abstractions

Data, metadata, and code at the Meta level

Working with abstract models

The role of the business agility platform

Agent-Oriented Architecture

Capabilities vs. Affordances

Implementing dynamic coupling

Implementing dynamic schemas

The role of extreme late binding

Who Will Benefit:

Enterprise Architects

Integration Architects

Software architects

Integration engineers

SOA specialists

Software developers

System Architects

Solution Architects

IT managers

Speaker Profile:

Jason Bloomberg is the leading expert on Digital Transformation through architecting agility for the enterprise. As president of Intellyx, Mr. Bloomberg brings his years of thought leadership in the areas of Cloud Computing, Enterprise Architecture, and Service-Oriented Architecture to a global clientele of business executives, architects, software vendors, and Cloud service providers looking to achieve technology-enabled business agility across their organizations and for their customers. His latest book, The Agile Architecture Revolution (John Wiley & Sons, 2013), sets the stage for Mr. Bloomberg’s groundbreaking Agile Architecture vision.
