

Tuesday, November 27, 2007

Maikish making it happen at Ground Zero

The acrimony among the World Trade Center site's stakeholders may have been eased and the years-long grip of inaction broken, but the challenge of rebuilding Ground Zero has in many ways just begun for Charles Maikish, the downtown construction czar whose role is to coordinate the different parties and projects at the site and ensure that its buildout proceeds on schedule. Beyond its massive scale, the redevelopment is complicated by the fact that a collection of stakeholders, rather than a single overarching authority, is involved in the many projects under way at the site.

But while the arrangement may seem the ultimate bureaucratic nightmare to some public officials, a unique level of cooperation and communication among the parties at the site has produced progress and efficiency.

"You can look out the window and see that things are going on at the site, you can hear the construction, it has really begun, which is exciting," Maikish said, peering from the window of his office at One Liberty Street, which offers views of the site.

That coordination is facilitated by the Construction Command Center, which centralizes decision-making on the staging and execution of the various projects through weekly meetings attended by the site's stakeholders.

Although Larry Silverstein and the Port Authority are the two names that probably come to mind for most people who think about Ground Zero, Maikish listed many more agencies, including the MTA and the Department of Transportation, that have a hand in how construction will proceed at the site.

On top of that, the resources that most construction relies on, such as staging areas and roadways for bringing in cement and materials, are in such demand that the command center has had to carefully apportion their use among the many construction crews, taking care not to impose restrictions that would jeopardize the timeline each project is operating under.

"There's so much going on at the site all at once, you can see how that can lead to log jams," Maikish said. "You lay out in a sequential manner when and for how long what activities are going to happen, it's called 4D modeling.

"You make a 3D model of the construction site and everything that's going to happen and then you add to that the fourth element, which is the passage of time."
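The scheduling idea Maikish describes can be sketched in miniature. The activity names and zones below are hypothetical, and this is only an illustration of the "fourth dimension" concept: each activity occupies a site resource for a window of time, and the scheduler looks for pairs that claim the same resource in overlapping windows.

```python
from dataclasses import dataclass

@dataclass
class Activity:
    name: str       # construction activity (hypothetical examples below)
    zone: str       # site resource it occupies (staging area, roadway)
    start: int      # start week
    duration: int   # duration in weeks

    @property
    def end(self):
        return self.start + self.duration

def conflicts(schedule):
    """Find pairs of activities that need the same zone at the same time."""
    clashes = []
    for i, a in enumerate(schedule):
        for b in schedule[i + 1:]:
            if a.zone == b.zone and a.start < b.end and b.start < a.end:
                clashes.append((a.name, b.name))
    return clashes

schedule = [
    Activity("tower foundation pour", "south staging area", start=0, duration=6),
    Activity("transit hub excavation", "south staging area", start=4, duration=8),
    Activity("memorial utility work", "west roadway", start=2, duration=3),
]
print(conflicts(schedule))
```

Real 4D modeling tools work against a full 3D geometric model rather than named zones, but the core test, the same space claimed at the same time, is the one shown here.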

The delayed deconstruction of 130 Liberty Street is becoming an increasingly controversial example of how tightly each project's progress is bound to the progress of the whole site.

Public officials had vowed the building, the former New York headquarters of Deutsche Bank, would be taken down years ago.

But aside from scaffolding and a black tarp that has enveloped the building's facade from top to bottom, seemingly little more has been done to take the building down.

Further delays could affect the construction of the entranceway to the network of underground roads that will tunnel under the WTC site.

Those roads are essential to the functioning of the buildings planned at Ground Zero and will also be used for the site's construction.

Service mitigates issuer risk under Regulation AB

Recon's data-management and transaction-reconciliation services provide third-party, independent comparison and analysis of collateral, as well as bond values and investor calculations. Suited to issuers with multiple servicing systems and/or master and subservicer relationships, the service also reduces the chance of generating 10-D reports that are inconsistent with Reg AB's Item 1105 Static Pool Information.

WALTHAM, Mass., Dec. 19 - Lewtan Technologies, Inc., provider of asset-backed securities surveillance, analytics, and content for the global securitization industry, today introduced Recon, a set of data-management and transaction-reconciliation services for mitigating issuer risk in the new era of Regulation AB. Recon is especially appealing to issuers with multiple servicing systems and/or master and subservicer relationships, and the services are a real boon to organizations with existing data-integrity issues or operational voids created in the transition to Reg AB. Beyond compliance, Recon helps issuers aggregate performance data to use in showcasing offerings to investors.

"In the new world of Reg AB, issuers face increased liability for the material accuracy of their securitized-transaction data - including a great deal of data that they don't originate or control themselves," said Paul Arvidson, vice president, Lewtan. "The situation is further complicated by disagreements about how Item 1122 servicer attestation should be carried out. Leveraging decades of experience and its unrivaled data warehouse, Lewtan created Recon to help issuers comply with regulations, mitigate risk, and illustrate past performance with historical data."

Mitigate Risk and Minimize Cost

Recon provides third-party, independent comparison and analysis of the collateral supporting a deal, as well as bond values and investor calculations. This is invaluable to issuers, who, since January 2006, have faced an increased liability standard for static-pool data on deals. Recon also greatly reduces the chance that an organization will generate a 10-D report inconsistent with Reg AB's Item 1105 Static Pool Information.

Firms who engage in the evolving best practice of continuous auditing can use Recon to automatically evaluate the veracity of their current remittance reports prior to the subsequent remittance period. The challenge of balancing accuracy and comprehensiveness against the cost of this validation can be daunting for firms auditing data on a random basis using manually intensive spot-checking; Recon offers a simple, cost-effective solution.
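The reconciliation step described above, comparing servicer-reported figures against independently computed values, can be sketched as follows. The field names and dollar amounts are invented for illustration, and this is not Lewtan's implementation; it only shows the general shape of an automated check that replaces manual spot-checking.

```python
def reconcile(servicer_report, independent_calc, tolerance=0.01):
    """Compare servicer-reported values against independently computed
    values; flag any field that differs by more than the tolerance."""
    discrepancies = {}
    for field, reported in servicer_report.items():
        expected = independent_calc.get(field)
        if expected is None:
            discrepancies[field] = ("missing from independent data", reported)
        elif abs(reported - expected) > tolerance:
            discrepancies[field] = (expected, reported)
    return discrepancies

# hypothetical remittance-period figures
servicer_report = {"pool_balance": 41_250_310.55, "interest_collected": 182_400.10}
independent_calc = {"pool_balance": 41_250_310.55, "interest_collected": 181_900.10}
print(reconcile(servicer_report, independent_calc))
```

Run each remittance period, a check like this surfaces discrepancies before the next report is filed, which is the essence of the continuous-auditing practice the article describes.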

Recon also reduces the overall auditing expense required for Sarbanes-Oxley and Reg AB compliance by leveraging Lewtan's completely up-to-date warehouse of data on more than 80,000 bonds (remittance-report data that Lewtan already collects and systematically checks), thus eliminating the need for issuers to build duplicate warehouses. By systematically automating data checks, Recon virtually eliminates human error. Leveraging Lewtan's data infrastructure in lieu of ad hoc data auditing is generally the most economical approach, as well as the fastest to implement.

Monday, November 26, 2007

A model endeavor

Traffic analysts today are faced with evaluating diverse and complex solutions to address congestion in transportation systems. Instead of "simply" deciding how many lanes to design for a new freeway or how long the turn bays should be at a traffic signal, practitioners now are analyzing advanced traffic signal and ramp metering systems, for example, and complex weaving and geometric configurations, intelligent transportation system strategies, multimodal corridor management plans, and congestion pricing strategies.

Traffic microsimulation analysis tools can help evaluate these complex solutions by modeling real-world transportation networks on a systemwide scale that is difficult with more traditional methods. Dramatic improvements in computer processing speeds and capabilities in the past decade have enabled traffic microsimulation software to model increasingly complex and larger scale transportation systems. As a result, microsimulation is quickly becoming popular among traffic analysts and is playing an important role in transportation investment decisions.

The Federal Highway Administration (FHWA) is a leader in developing traffic microsimulation models, dating back to the development of NETwork SIMulation (NETSIM) in the 1970s, FREeway SIMulation (FRESIM) in the 1980s, and the merging of NETSIM and FRESIM into a single CORridor SIMulation (CORSIM) model, which was integrated into the Traffic Software Integrated System (TSIS) package in the 1990s. In the early 1990s, TSIS/CORSIM was the only viable traffic microsimulation model available to practitioners. By the late 1990s, however, a number of commercial vendors began offering their own versions of traffic microsimulation packages to meet the growing demand. Today, the popularity of microsimulation packages continues to increase, and there is now a viable market for commercial traffic simulation vendors.

In the early 2000s, FHWA reevaluated its future role in the traffic simulation market. A survey of traffic practitioners and existing traffic simulation packages revealed that most of the software packages, although robust and offering a range of analysis options, still have intrinsic limitations that can affect the performance and accuracy of model results. These limitations in the functionality of current microsimulation systems have generated questions in the transportation community. For example, simulation users view many microsimulation software packages as "black boxes" in that users are not sure how model outputs are calculated and, as a result, are not confident in the accuracy and validity of the model results.

As a result of the market assessment, FHWA decided to take a different role in the traffic simulation market. Rather than compete with the commercial simulation vendors by continuing to develop TSIS/CORSIM, FHWA would act in a "market facilitator" role by focusing public resources on fostering an environment of public-private coordination through research products that will benefit the entire traffic simulation community: practitioners, vendors, and researchers.

Enter the NGSIM Program

With the goal of improving the quality and use of traffic microsimulation tools to facilitate transportation decisionmaking, FHWA's Traffic Analysis Tools Program began the Next Generation Simulation (NGSIM) program in 2002. NGSIM is a unique public-private partnership between FHWA, transportation consulting companies, university researchers, and foreign and domestic commercial microsimulation software developers.

The objective of the program is to develop a core of driver behavior algorithms that represent the fundamental logic in traffic microsimulation models, with supporting documentation and validation datasets. NGSIM products will be well documented, openly distributed, and free to the transportation community through the NGSIM Web site (www.ngsim.fhwa.dot.gov).
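To make "driver behavior algorithms" concrete, here is a minimal car-following model, the kind of fundamental logic such a core would document. This sketch uses the Intelligent Driver Model (IDM), a well-known formulation from the research literature; it is offered as an illustration of the genre, not as one of the NGSIM algorithms themselves, and the parameter values are generic defaults.

```python
import math

def idm_accel(v, gap, dv, v0=30.0, T=1.5, a=1.0, b=2.0, s0=2.0):
    """Intelligent Driver Model: acceleration of a following vehicle.
    v:   own speed (m/s)
    gap: bumper-to-bumper gap to the leader (m)
    dv:  approach rate, own speed minus leader speed (m/s)
    v0 desired speed, T desired time headway, a max acceleration,
    b comfortable deceleration, s0 minimum standstill gap."""
    s_star = s0 + v * T + v * dv / (2 * math.sqrt(a * b))  # desired gap
    return a * (1 - (v / v0) ** 4 - (s_star / gap) ** 2)

# one simulation step: follower at 25 m/s, 40 m behind a leader doing 20 m/s
dt = 0.5
v, gap = 25.0, 40.0
acc = idm_accel(v, gap, dv=5.0)     # closing fast, so the model brakes
v_next = v + acc * dt
print(round(acc, 3), round(v_next, 3))
```

A microsimulation package applies a rule like this, plus lane-changing and gap-acceptance logic, to every vehicle in the network at every time step, which is why validated behavioral algorithms and open datasets matter so much to model credibility.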

"The NGSIM program represents a model public-private partnership that has yielded demonstrable benefits for both sectors," says Nagui Rouphail, chairman of the NGSIM stakeholder traffic modelers group and director of the Institute for Transportation Research and Education at North Carolina State University. He adds, "Here the [U.S.] Government acts as the catalyst for developing sound science and the data to back it up, while the private sector commits to participate in the development process as well as incorporating the research findings into its commercial software. This process ensures wider dissemination of the research results and even wider acceptance of the underlying science."

The NGSIM team is composed of traffic simulation and modeling experts managed by a private company. The team is supported by senior advisers from respected transportation institutions across the Nation and includes three stakeholder groups: a traffic modelers group that represents researchers and others who develop driver behavior models, a software developers group of private vendors responsible for developing and maintaining commercial traffic simulation software, and a model users group that represents the practitioners who use traffic simulation models for decisionmaking.

Courtroom alchemy: adequacy advocates turn guesstimates into gold

Beginning in the late 1960s, and accelerating unabated through to the present, plaintiffs have filed more than 125 court cases questioning the constitutionality of school district and school spending levels. In 2005 alone, high-court decisions were handed down in eight states, including Kansas and Texas, with a decision rendered in South Carolina that has national implications.

Cases in seven more states, including Kentucky, Louisiana, Missouri, and Nebraska, are now pending decisions concerning issues of adequacy in state funding mechanisms. And legal challenges to state funding mechanisms are not one-off endeavors. Arizona, California, Connecticut, Kansas, New Hampshire, New Jersey, New York, Ohio, Pennsylvania, Texas, and Wyoming are states in which there have been not one, not two, but as many as five or six legal challenges to legislatively determined spending levels.

Much of the litigation, particularly early on, centered on the issue of funding equity. As of 2005, funding mechanisms in 36 states had been challenged on the grounds that interdistrict spending was inequitable. Increasingly, however, cases have focused instead on the overall amount, or adequacy, of funding. Beginning in the 1990s, enactment in virtually every state of learning objectives and curriculum standards provided a new reference point for plaintiffs arguing that funding was inadequate overall. By 2006, the constitutionality of funding mechanisms in 39 states had been challenged on adequacy grounds (see "Judging Money", research, p. 68). Indeed, through the first half of 2006, funding mechanisms in only five states--Delaware, Hawaii, Mississippi, Nevada, and Utah--have been spared constitutional challenge.

Few would seek to deny American public school students access to the courts when inadequate school funding threatens their chances for achieving academic, and ultimately economic, success. But contemporary school-finance adequacy litigation goes far beyond seeking equity for the educationally disadvantaged. The movement is becoming a self-serving cause whereby plaintiffs have gained relatively uncontested judicial access to the policy process. Indeed, unsubstantiated claims and unreasonable requests contained in costing-out studies commissioned by plaintiffs have successfully circumvented democratic executive and legislative funding dynamics. The trend threatens to erode public interest in and support for K-12 education policy.

Plaintiff Victories Pick Taxpayer Pockets

Amounts awarded by courts are often substantial. Wyoming and New Jersey are fine examples. Wyoming plaintiffs have returned to court six times and have so far doubled Wyoming's per-pupil spending, elevating it from $5,971 in 1996-97 to an estimated $12,422 for 2006-07. Beginning teacher salaries, for those with master's degrees, rose in constant dollars from $24,402 in 1997 to $32,451 in 2004, a 33 percent increase. The average student-teacher ratio declined from 15 to 1 in 1993 to 13 to 1 in 2003. In spite of dramatic increases in spending, Wyoming student achievement levels in math as measured by the National Assessment of Educational Progress (NAEP) have either been stagnant or dropped relative to the United States as a whole.
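The growth figures above can be verified directly from the dollar amounts the article cites; the two-line calculation below confirms that Wyoming's per-pupil spending roughly doubled and that the salary gain is about 33 percent.

```python
def pct_increase(old, new):
    """Percentage increase from old to new."""
    return (new - old) / old * 100

# Wyoming per-pupil spending, 1996-97 vs. estimated 2006-07
print(round(pct_increase(5971, 12422)))   # slightly more than double
# Beginning teacher salary (master's degree), constant dollars, 1997 vs. 2004
print(round(pct_increase(24402, 32451)))  # about 33 percent
```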

While Wyoming is a poster child for litigant success, it is not all that extreme. New Jersey per-pupil spending, in response to Robinson v. Cahill and Abbott v. Burke, has been elevated in constant dollars from $4,688 in 1970, when the litigation began, to $13,229 in 2003. So-called Abbott districts, those that receive the largest share of new state funding, in select instances spend in excess of $19,000 per pupil, a figure that rivals day-student tuition at many of the nation's most prestigious independent schools.

How Much Is Adequate?

The underlying question seems reasonable enough: after all, if Johnny's school is underresourced, how can Johnny be expected to meet the state's expectations for learning? The problem is that no one knows with any degree of certainty how much money it takes for Johnny to meet state-derived learning standards.

Ensuring that sufficient resources are available for all students to meet state-specified learning standards is a laudable policy objective. Unfortunately, contemporary legal petitions for resource adequacy go far beyond the analytic capacity of present-day social science. The evolving concept of financial adequacy requires researchers to ascertain far more elusive relationships between education inputs, processes, throughputs, and outcomes. Researchers have simply not yet discovered answers to many of the questions regarding these relationships. For example, the amount of money or configuration of schooling resources needed to compensate educationally for impoverishment, disability, or language deficiency is simply not known.

Still, court cases proceed, and even proliferate, with the primary evidence coming in the form of adequacy cost studies. According to ACCESS, a project of the Campaign for Fiscal Equity, Inc., a total of 58 cost studies had been conducted in 39 states as of January 2006. Of these cost studies, state courts initiated 7, state government agencies initiated 34, and independent groups initiated 17. At least 20 cost studies in 14 different states were undertaken between January 2004 and December 2005, with a potential for at least 5 additional studies in 2006. Two adequacy cost-modeling methods are employed most often: the econometric or cost function approach and the professional judgment approach.