Census_2015.mp3 Transcript

I'm going to talk a lot less technically about things, and a little bit more from a broader perspective, in terms of what we're planning for the 2020 census and our path to actually getting the 2020 census off the ground. One of the things that I should probably say as an introduction is that in the 2010 census, the state of California was split between two Census Bureau regional offices. Pretty much everything from Fresno south was handled by the Los Angeles Regional Office of the Census Bureau, and everything north of that was handled by the Seattle Regional Office. A few years ago the Census Bureau reorganized, so now there is no more Seattle Regional Office, and the Los Angeles Regional Office will handle all of California. So the geographers that Linda referenced, that's who you will all work with; the whole state will be handled out of Los Angeles. That's a little bit different than what we had in 2010. But I want to start by talking a little bit about 2010. We have a graph up here that shows how we do in terms of census accuracy. We measure it through something called the net undercount, and it is important to emphasize that word net; I think sometimes we miss that. This tracks the quality of the census for every census since 1980, and you can see that as we made progress through the decades we got a little bit better, culminating in what we felt in 2010 was probably our most accurate census in the history of the United States. We actually overcounted by a very small percentage, one one-hundredth of a percent, again on a net basis. But we felt that the accuracy of the 2010 census was our best ever, and it's something that we want to do again.
So we like this graph. As we're looking at preparing for 2020, this next graph shows how much the census cost, in current 2010 dollars. Are there any economists in the room? I heard one little mumble; no economists. I'm going to give you some numbers that show what we're projecting the 2020 census to cost, but those are in 2020 dollars, so I'm going to need somebody to help me adjust 2010 to 2020. This is what the cost of the census was in real 2010 dollars, and you can see that it's gone up in terms of the cost per housing unit. This graph we don't like, and it's not just us that doesn't like it; it's the Congress that provides the appropriations to do the census that doesn't like it. We don't want to see this graph again in 2020. So it's not a surprise that most of our objectives are about preparing for the 2020 census. Before I leave this: we project that the 2020 census would cost $124 per housing unit, in 2020 dollars, if we just did things the same way. But we're trying to make some improvements, and we've targeted four areas of innovation. It's no surprise that we're not looking to change the quality significantly; we want to keep that intact. But we are looking to address the cost, and we think that by introducing a number of innovations for 2020 we can drive that cost down to about $88 per housing unit, in 2020 dollars. So these are the four areas of innovation where we think we can save up to $5 billion. I won't go through these in any great detail, but I want to hit some of the highlights, because the next thing I'm going to do is talk about some of the testing we're doing, and you can see how what we're testing contributes to these areas.
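As a back-of-the-envelope check of those per-unit figures, here is a minimal sketch; the count of roughly 140 million U.S. housing units is my assumption, not a number from the talk:

```python
# Rough check of the projected savings implied by the talk's per-unit figures.
# Assumption: about 140 million U.S. housing units (not stated in the talk).
housing_units = 140_000_000

cost_baseline = 124  # projected 2020 cost per housing unit, same methods (2020 dollars)
cost_target = 88     # target cost per housing unit with the four innovations

savings = (cost_baseline - cost_target) * housing_units
print(f"Projected savings: ${savings / 1e9:.1f} billion")  # about $5 billion
```

Under that assumed housing-unit count, the $36-per-unit difference comes out very close to the $5 billion savings figure cited.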
The first thing we want to do is re-engineer our address canvassing; we want to look at how we list addresses in the field. Obviously, before we ever send a census questionnaire, we have to know where to send it, so we want to build a good address list. A lot of what Linda talked about was targeted towards building a good address list for us. The next thing we want to do is optimize self-response, and that's just a fancy way of saying we want people to count themselves rather than us having to count them. We get much better data, and it's a lot more efficient, when you count yourself and answer for yourself, rather than a census taker coming to your door or calling you on the phone to get that information. The third area where we think we can make some improvements is in using administrative records. There's a lot of information out there about addresses and about people that we can potentially use without having to reinvent every single data set, so we're looking at ways to incorporate that to reduce the cost. And then the biggest area where we can save money is re-engineering field operations. When people don't send back their census questionnaire, we have to hire somebody to come knock on their door. Last census, for the duration of the census, we hired over 700,000 people nationwide to go out and do some activity, whether it was listing an address, knocking on a door, or supervising the people that did that. We think we can save a lot of money by reducing the number of staff we hire and all the infrastructure it takes to support them. So this is kind of our planning cycle. This is a graph, and I don't know how well you can see it; it's not really important except to know that we want to go from red to blue. There's a timeline, a path that we're going to follow, and it has already begun.
We're underway in terms of testing, and I'm going to go over some of the tests that we've already finished and some of the areas we're focused on, and then talk a little bit about where we're headed. We started our research in a big way in 2015, and we did a number of tests there. I'm not going to go through these in any great detail, but some of what we did was kind of in-office. We did this thing called a SIM X. Basically, if you've done any modeling, you just throw tons of data into something and wait until the model breaks. We did something like that using a lot of people: an in-office replication of a census environment, where we threw a bunch of data into a system and tried to find its capacity. That gave us some ideas about the infrastructure issues we're going to need to address from a systems standpoint. The next thing we did was some address validation. That was a test to get a sense of what would and wouldn't work in terms of getting a good address list, and it began to help us refine how we were going to do address canvassing: do we need to send a census taker out to walk that street, or can we use some other method to determine a good list of addresses? In 2010, for reference, we sent a census taker to walk or drive every street in the United States. That obviously includes the urban areas, but we also drove through a lot of remote areas where there was nothing there.
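The "throw data at it until it breaks" idea is essentially a capacity probe. A minimal sketch of that logic, with a toy stand-in for the system under test (none of this is Census Bureau code):

```python
# Toy illustration of capacity probing: keep increasing the load on a
# system until it fails, then report the largest load it handled.
# process_batch is a pretend system, not a real census component.
def process_batch(n_records, capacity=1_000_000):
    """Pretend system: raises once the batch exceeds its capacity."""
    if n_records > capacity:
        raise MemoryError("system capacity exceeded")
    return n_records

def find_capacity(start=1000, factor=2):
    """Double the load until the system fails; return the last good load."""
    load, last_good = start, 0
    while True:
        try:
            process_batch(load)
            last_good = load
            load *= factor
        except MemoryError:
            return last_good

print(find_capacity())
```

The real exercise involved people and production-scale systems, of course; the sketch only shows the ramp-until-failure shape of the test.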
But we covered, I think it was like 99.8% of the ground in the United States as part of our effort to list addresses in 2010. The next three tests I'm going to spend a little bit more time on: optimizing self-response, the 2015 census test, and the national content test. So let's go into those a little bit. We chose an area in Savannah, Georgia that was kind of a mix of middle class and lower middle class areas, and we tried different ways to get people to respond to the census. We made a decision very early on that we were going to allow an Internet option, which we sit here now and say is a no-brainer, but in 2010 you could not answer the census on the Internet, because of concerns about privacy and a number of perception issues associated with that. So we started the testing process just to see how people would respond to the Internet. We tried some things that we knew probably wouldn't work, but felt we needed to try anyway, and we tried some things that we really thought would work. One of the things we tried, and this is one of those ideas that sounds good in a room like this but doesn't work when you try it with real people, was a pre-registration, something we called Notify Me. We would send out a mailer and say, hey, when the census comes, sign up and we'll let you know when that census questionnaire is arriving. So we sent out a mailing to a whole bunch of people and said, hey, are you ready to sign up and register to do the census? And nobody responded to that. What they did was go to the website and say, OK, I want to answer my census. But all we were doing was saying, sign up so we can tell you when to do the census. So we found out pretty quickly that the Notify Me approach wasn't going to work, but that's something we tried and tested.
We also started testing this idea of non-ID processing. That name is going to change as the decade goes on, because people don't really understand what it means; I'm not sure I understand what it means. But obviously one of the key things we need in the census is matching: not just how many people are counted, but where they're counted. You are big into redistricting, so that's very important to you. I remember the Citizens Redistricting Commission meetings that I got to watch, you and Karen, and a lot of the conversations that came out of that. Where people are counted is important. So how do you associate the answers with the location on the ground? In the past, the only way we've been able to do that is to give people a census-specific ID, a nine-digit or 14-digit code that is linked to your address on the ground. That's a challenge in terms of getting people to participate: they have to know what that code is in order to answer the census. What we began to test in 2015 is, what if we don't require them to have that code when they answer? Either, as part of their response, they can code themselves, so we can match it up to a census ID, or after the fact we can take that answer and do some research to assign the census ID. We found as part of this 2015 test that that had some promise to it, particularly in urban areas. I mean, an address is an address, and if somebody codes themselves to that address, why do we have to force them to have that 14-digit number in order to answer the census? So we began to show some promise with non-ID processing, where you could answer the census without having a specific ID code. We also looked at expanding how we're going to engage the public in terms of answering the census. How many of you completed your census form in 2010?
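The matching idea can be sketched very simply. Everything below is my own simplification: the normalization, the lookup table standing in for the master address list, and the research queue are illustrative, not the Bureau's actual systems:

```python
# Sketch of non-ID processing: match a respondent's free-form address
# against a master address list to recover the census ID, instead of
# requiring the respondent to supply the ID themselves.
def normalize(address: str) -> str:
    """Crude address normalization for matching purposes."""
    return " ".join(address.upper().replace(".", "").replace(",", " ").split())

# Stand-in for the master address file: normalized address -> census ID.
MASTER_ADDRESS_FILE = {
    "123 MAIN ST SAVANNAH GA 31401": "12345678901234",
}

def match_response(address: str):
    """Return (census_id, needs_research) for a non-ID response."""
    census_id = MASTER_ADDRESS_FILE.get(normalize(address))
    # Unmatched responses go to a follow-up research queue rather than
    # being dropped.
    return census_id, census_id is None

print(match_response("123 Main St., Savannah, GA 31401"))
```

The point of the sketch is the two outcomes the talk describes: either the response codes itself to a known address, or it falls through to after-the-fact research.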
I hope all of you did. If you follow the national pattern, about 65% of you answered the census and we had to go knock on about 35% of your doors. We want to continue to make the public aware so that they will answer themselves, and we do that through a number of channels. We do Super Bowl ads and things like that, but we're finding there's increasing opportunity in things like social media. How do we motivate people to respond? Imagine if something comes up on your Facebook feed and says, hey, you didn't answer the census, answer the census. I don't know that that's going to happen, but that's something that's available to us this time that wasn't before, so we're looking at that. We found a number of key findings, which are covered here. Basically, we still have to engage people to get them to answer; they're not just going to answer out of their own interest in doing good. They need to be motivated to respond. Pre-registration is a lousy idea, but now we know. And the third thing was that non-ID processing holds some promise for 2020, so we wanted to continue to look at that. We also did a test in 2015 in Maricopa County. It wasn't the entirety of the county; we picked a select number of areas throughout the county, about 165,000 housing units. It wasn't a small test, but it wasn't a million housing units or anything. And we started looking at how we follow up on people that don't answer the census. We used it to test our self-response options and gave them a web option, but we also incorporated some new ways of managing our field staff, to see if there are things we can do to reduce the workload when we go follow up on people who don't answer the census.
One interesting thing we found, and I always get this number wrong so please don't quote me on this: in the 2010 census, as I said, about 35% of housing units didn't respond to the census, and about 30%, I want to say it's like 33%, of the addresses that did not respond were vacant. So we actually hired somebody to go knock on a door where nobody lived. I mean, that's what you have to do, right? Well, maybe not. And that's where this concept of administrative records comes in, because there are data sources that show that certain addresses might be unoccupied. So we started testing this in the 2015 test: could we use external data so that we don't have to send a census taker to an address that we already know is vacant? That was one of the components of this test. If we're successful at that, we can reduce the number of people we have to hire to go knock on doors. Some of the other things we incorporated: our process in 2010 was to have a census taker try six times to get an answer from somebody. If you didn't answer your census, you would get knocks on the door six times until we gave up and tried something else. That's the general guideline; I know from some of the phone calls I dealt with that we tried a lot more than six times at some households. So we have to pay somebody to go out and knock on that door six different times, and that's an expense. What we tried to do is incorporate some of the things we know about addresses, and some of the other data we can use, to make that enumerator more efficient when they make that follow-up attempt. How many of you have ever gotten a package at your house from FedEx or UPS? Just to be fair.
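The administrative-records idea reduces to a filter on the follow-up workload. A minimal sketch, where the record format and the "likely vacant" flag are my own stand-ins rather than the Bureau's actual data sources:

```python
# Sketch of using administrative records to trim nonresponse follow-up:
# drop addresses that external data flags as likely vacant before
# assigning them to a census taker. Record format is illustrative only.
def build_followup_list(nonrespondents, admin_records):
    """Keep only non-responding addresses not flagged as likely vacant."""
    return [addr for addr in nonrespondents
            if not admin_records.get(addr, {}).get("likely_vacant", False)]

nonrespondents = ["101 Oak St", "202 Elm St", "303 Pine St"]
admin_records = {
    "202 Elm St": {"likely_vacant": True},  # e.g. utilities shut off
}
print(build_followup_list(nonrespondents, admin_records))
```

If roughly a third of non-responding addresses really are vacant, a filter like this is exactly where the staffing savings the talk describes would come from.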
FedEx and UPS use a lot of routing and timing technology to track the efficiency of their drivers and their delivery patterns. We're incorporating some of the same logic into how we have enumerators go out and follow up on addresses. We give them smartphones, we give them a list of addresses that they need to go to, and we give them that list in sequential order: you need to go here first, you need to go here next, and here next. That's very different than what we did in 2010, where we gave them a pile of questionnaires and said, you have a week to get these done, bring them back when they're finished, and we left it to them to decide how. Now we give them a list of addresses to contact that day. They tell us when they're available to work, because of the technology, and we can give them about enough work for a day. They go out and do that work, and when they're done they tell us what they can do the next day, or maybe they want to take the day off and we don't give them work. We can be a lot more efficient in how our enumerators work and how we distribute the work. We also incorporated some of the X data to identify places where people are likely to be home, which sounds like it makes sense, but X tells us, you know, when people leave for work and things like that. So we could incorporate some new methods, and we found as a result of the 2015 test that those were good decisions. It does look like we can improve the productivity and efficiency of our staff, and on a census scale that means big things. If we can make even slight reductions, when you apply that to a workforce of 500,000 or 700,000 people, even small percentages can generate efficiency, reduce offices, and reduce the number of supervisory staff that we have to put in place.
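The "go here first, here next" sequencing can be sketched with a simple greedy ordering. Real routing software is far more sophisticated; the nearest-neighbor heuristic and the toy coordinates below are illustrative assumptions of mine:

```python
# Sketch of the FedEx/UPS-style idea: hand an enumerator a day's
# addresses in a sensible visiting order. A greedy nearest-neighbor
# ordering on (x, y) coordinates stands in for real routing software.
import math

def order_route(start, stops):
    """Greedy nearest-neighbor ordering of stops from a start point."""
    remaining = dict(stops)  # address label -> (x, y)
    here, route = start, []
    while remaining:
        # Visit the closest unvisited stop next.
        name = min(remaining, key=lambda n: math.dist(here, remaining[n]))
        here = remaining.pop(name)
        route.append(name)
    return route

stops = {"A": (0, 1), "B": (5, 5), "C": (1, 0)}
print(order_route((0, 0), stops))  # -> ['A', 'C', 'B']
```

Even a crude ordering like this illustrates the shift the talk describes: the office decides the sequence and the day's workload, rather than leaving a week's pile of questionnaires to the enumerator's judgment.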
So that had the potential for big savings, which the 2015 tests showed us. But it isn't just the processes and how we engage the public; it's also what we're going to ask them. So we're in progress with the national content test, where we're testing some new things in terms of how we ask the public questions. Some of them are new ideas, some are just different ways of asking things. And this one probably generates a lot of interest from folks, because we start looking not just at how we do it, but at what we're getting from the public. So I want to make sure I walk through this one with some specificity, because I get lots of questions on this and I don't want to get it wrong. The areas we're testing are race and Hispanic origin, relationship, and within-household coverage. The idea is that we want to make sure these estimates are focused on the self-response and Internet response rates: we want to see the effect of the Internet. Does it change the dynamic of how people respond? And does changing the ratio of self-response change the patterns of how people respond? So we're putting a huge Internet push out there, with different types of the questionnaire, and we're getting the responses, but we're also putting out a whole bunch of quality evaluations. Households that respond, many of them will get a callback or two, or follow-up attempts, just so we can validate this information. You can see that this is ongoing; we expect to complete the self-response period at the end of this month, and then we'll start drawing some conclusions. So these are the key content areas of what we are testing on this national content test. Race: is this of interest to you? I don't want to bore you if you're not interested, but race and Hispanic origin is always a very stimulating topic.
When you go out and talk about census, how do you ask questions on race? And in particular this issue of Hispanic origin: is Hispanic a race? Well, the Census Bureau says no, and OMB says no. But now we're starting to assess whether that's the right approach, so we're testing whether we keep those two concepts separate. I think all of you know this, but the Census Bureau considers race differently than Hispanic origin; they're independent concepts. You can be of any race, and you can be Hispanic or not Hispanic. Now we're starting to combine them into one question, to see what the effect of that is. We're also looking at adding a new category that we call MENA, which is Middle Eastern or North African, and we're getting some insights into whether that's an effective way to count people and how people identify with it. We're changing the instruction wording and question terminology. Some of you are experts on this, but I'm always fascinated by how the numbers change depending on the sequence of how questions are asked. If you put Black as the first category, you get a different number than if you put Black as the second category. If you put the Hispanic origin question first, you get a different number than if you put the race question first and then Hispanic origin. So we're doing lots of research, not just on the content but on the sequence of how we ask those questions. And then we're looking at what changes when you move from a paper form to a web-based form. With a web-based form we can obviously provide lots more options. You can put things in a long drop-down list that you couldn't put on a paper questionnaire; you can't have 32 options on a paper questionnaire, but you can on a web form.
So we're looking at how that affects things and what options are available to us now with the web-based questionnaire. Relationship is focused primarily on two areas. Same-sex marriage is one thing we're looking at, in terms of how we ask that. And we're also, I think, reintroducing foster children; I think we've asked that in the past, and we're adding it now as an option, or testing it, for the categories. And then within-household coverage. This is probably a little less technical; it has to do with the definition of households and how we ask about households. And I've got to be honest, I don't know a lot about this, the difference between rules-based approaches and question-based approaches, but we've traditionally asked a lot of validation questions, like: have I missed any babies or small children? Is there anyone else living here? Do you have an elderly relative? Those kinds of things. We're looking at what generates the most complete household coverage. It's fascinating how many times people forget to mention that they have a baby. You'd think, you know, that's the light of their life, and they forget them on the census. So we do have to insert these validations for coverage. And it's not just the content; we're also testing how we introduce people to the census. When you give people a web-based option, how do you get them to find a URL? I always love how at the end of our PowerPoints we put in the URL, right there: we say thank you, here's the address, you should go look at that. How many of you go back and take that in? If you're like me, you just go back and say, well, yeah, these are the smart people, and then later you think, oh, I wanted to look at that, and you Google it or whatever, and try to find it that way. But how do you make people aware that this is the URL to use?
How do you make sure that the URL you're giving them isn't something that's stolen from somebody else? You know, there's a lot of phishing and scamming going on. How do you make sure they're going to the correct URL, the one you want them to use? So we're trying a whole bunch of different strategies; you can see them here. We're going to test: if we just send you an email with a link, how do we know it's you? That's one of the challenges of using just email. We're trying postcards, we're trying postcards and then a questionnaire, we're trying questionnaires with a letter, we're trying just a letter, just a bunch of different strategies to see what generates the most response and what is the most efficient way to get people to respond. You can see the sequence here; I won't go through all of these options, but we're trying a number of different things. Yeah, so that's another thing: how do you get email addresses? One of the ways, I mean, you can get email addresses administratively, you can purchase lists of email addresses. We're not super optimistic that this is going to work, but there's also this registration option that we tried, where people can provide us with email addresses in advance. Again, that's not something we're optimistic about, but we want to at least try and test it. We might find that there's some great directory out there. I think a lot of individuals have, yeah, the one that you use to register for everything, so all the spam goes to that one. Not that I know anything about that. So I want to talk also about what we're getting ready to test, and this one's a little personal, because we're doing it in Los Angeles County. And I actually think it's great that we're doing a test in California, because things that happen in California generally have broader application to the rest of the country.
This test is going to be designed to start putting a lot of this stuff together. Two days ago, on Tuesday, we announced our operational plan. Did any of you watch that? A few. The rest of you missed a stimulating day of stuff. I was trying to watch the Twitter feed to see, you know, is anybody commenting on this? And all of a sudden my timeline blew up, because I'd put in the word census, and I thought, wow, a lot of people are watching this, what is this all about? Well, at the same time we were doing our thing, they announced a census of gorillas in Botswana or something, and everybody was commenting on the census of gorillas. So, John, you didn't put anything in your Twitter timeline about the census? At some point I was. On the gorillas. Oh, yeah. OK. So we just announced this. With the 2016 test, we've moved from this concept of experiments and trying different things to beginning to refine our tests, focused on what we're going to do in 2020. We've made some decisions about some things, which we shared on Tuesday, so this 2016 test is the first solid test of things that we're going to do for 2020. We're doing it in two places: in L.A. County, and I'll show you a map in just a second, or you can sneak ahead and look at the map on the next page, and in Harris County, Texas, which is Houston. In those two areas we're focused on a number of things. We're looking at places where there's language diversity, and again, with the introduction of technology and web responses, how do people respond in languages other than English? And not just languages other than English, but how do they respond in languages that don't use Roman alphabets? So we're looking for areas with high concentrations of people that speak Chinese, Korean, or Vietnamese.
And so you'll see that the area, if you're familiar with L.A., is heavily concentrated with people that speak Chinese. We're looking for places that have demographic diversity or are hard to count. We're looking at places that have high vacancy rates. Guess what: this area in L.A. doesn't have high vacancy rates, but the place in Houston does, so they're going to test that there. The reason we want to test a place with high vacancy rates is so we can continue to refine that administrative-records approach for removing units before we send a census taker out. And then we want to look at places where people use the Internet differently. Some places are very robust in terms of technology; that's why we're not doing this test in San Francisco or the Bay Area. Sorry, but there's a lot of Internet use there. So we're picking areas with varying levels of Internet usage, and we also want to look at different time zones. This is approximately 225,000 housing units in each of these test areas. We're going to be focused on continuing to improve the way our census takers do the census, and we're going to try different ways of doing self-response. This is where we're going to really start refining how we notify the public about how to do the census. Do we just send them the postcard, which previous tests have shown the most promise, or send them the letter and then follow up with a paper questionnaire to those that don't respond? Is that going to be the best approach? And just because it worked in Savannah, Georgia, is it going to work in L.A. and Houston? Those are some of the things that we're going to try and work on. We're going to try some other things too. You'll see on this last line, use of text messaging: we're not going to be doing that. Sorry, no text messaging, so you can just line through that. But we are going to be really looking at this non-ID processing, and here's why this is a big deal.
Most of our efforts in 2010 were focused on engaging the public to participate, and our messages were: go home, look for the form, fill it out, and send it back. Or, if you didn't get the form, call this number, they'll mail you another form, fill it out, and send it back. What non-ID processing provides is an opportunity for people to answer the census anywhere, any time, regardless of where they are. So at a community event like this, let's say that none of you had filled out your census form and we got you all in a room. This is very hypothetical, because all of you will fill out your census form, I know. But we bring you in, we could set up a kiosk with a couple of tablet computers and say, hey, thanks for coming, here's the donuts; if you want a donut, you've got to fill out your census form. And we could get the census responses right there. The non-ID process means we don't have to wait for a form to arrive at your house or something like that. So we want to test that and make sure it works. We want to test it both to make sure that we can code people to the proper location, and to make sure we don't introduce a bunch of fraud in the process and allow people to just start submitting lots of questionnaires. So we're working on that. This is the test area for L.A. and California; if you're familiar with this area at all, it's the San Gabriel Valley. It's really a fun test area, and I'm looking forward to being a part of this. Do we actually have anybody who lives in this test area? Anybody? All right. Oh, OK. Well, then you won't be tested. We're also doing an address canvassing test, and this is again trying to refine the ratio of how much of this work we do in an in-office environment versus how much we actually do in the field.
We're testing this on a number of fronts. We're assessing, you know, can we use aerial photography, can we use digital address lists? Those of you that are familiar with GIS know you can get pretty close to a comprehensive address list just using the tools on your desktop. But there are some places where things like trees get in the way, or cloud cover, or whatever. Where do we need to distinguish between being able to do this at your desk and going out to do it? So we're doing this in an office environment, and then we're sending out a team of census takers that do this work for a living. They're going to go out and actually list these areas, and then we're going to compare the results. That's going to be done throughout 2016. So that's kind of what's current now. We're going to continue to do more testing as we approach 2020. In 2017 the tests will just get bigger; we'll start integrating more things, pulling in more pieces, and I'll show you what some of those pieces are in a second. In 2018 we do something we've done in the past: a dress rehearsal. Last decade, I think Stockton was our dress rehearsal site. We haven't announced the sites for 2018, or actually for 2017 either, but we'll start picking areas across the country. I know we'll be testing some tribal areas in 2017, to see how this works on American Indian reservations, and we'll probably incorporate some more rural areas as well. So this is kind of what we released a couple of days ago, and you get a sense of the complexity of what we're covering. This is the consolidation: there are 34 different functional areas in our operational plan, and each one of those 34 areas we're incorporating as part of an overall testing strategy. Some of it is linked to the past, where we're taking the lessons learned from 2010.
But a lot of it is incorporating things that we know, or things that we want to try, in terms of getting ready for 2020. That documentation is available online, and as I said, you can go type in this URL, census.gov/2020census, and the full set of materials is available online if you want to look at it. If you're like me and you like to look at pictures, this is the whole operational plan in one picture. So it's kind of like a cartoon, not a very funny cartoon, but you can get a sense of the whole operational plan there. We essentially move from building a list of addresses in the lower left corner all the way to releasing the census results in the lower right corner, and we go through this cycle of activities in between. So that's the overview. Any questions?

[Audience question, partially inaudible, about combining the race and Hispanic-origin questions: did that show a higher overall response rate, and would more people end up responding in the "some other race" category?]

Yeah, I think, I mean, obviously the ratios will be different depending on how we ask the question. And there's a lot of concern about how people change the way they answer, and what that does in terms of how we tabulate the data. But I think another big issue for us is: how do you make that comparable to previous censuses? Because the value of the census is really looking over time. And those of you that have been doing this for a while know that when we introduced the multiple-race category, that really made things complicated in terms of comparing data over time. So that's a big part of the content test: to assess not just what happens with how people respond, but how we match that up with previous censuses.
There will be a lot of opportunities to comment on this. If you're familiar with Federal Register notices, there's going to be a lot published in the Federal Register in terms of opportunities for comment. There's a National Advisory Committee meeting that's happening right now, and the chair of that advisory committee, as some of you may know, is from the state of California. So California is well represented on that, and they will be very focused on this issue of race and ethnicity from a broad perspective as well.

[Audience question: Could you expand a bit on what might be expected going forward regarding administrative records, especially what's involved for the states and what you'll need from them?]

Yeah, we want it all, John. We want everything. No, no. So there's a whole effort devoted to assessing what is a usable administrative record and what we're going to use. Right now, the administrative records that we're using are limited to those that have national coverage, so they're primarily, I think they're all, federal data sets. We get data from the Social Security Administration, for example, and IRS data is an administrative data set that we use. So right now we're just using those, but we have started conversations internally about how we're going to incorporate locally tailored administrative data sets. So, for example, Southern California Edison has a great database of addresses where they've turned off utilities. That might be a good measure of whether or not a unit is occupied or vacant. That could be a good source, but Southern California Edison doesn't do us much good in Idaho. So we're thinking about how that will work and how we'll incorporate those things. In terms of what I'm going to ask you to do at this point, I don't think much, until I know really what to ask.
I don't want you to spend a lot of energy organizing administrative records for that purpose. I will tell you that I've started conversations with some of the mayors and other elected officials about two things that I want them doing right now, or I shouldn't say that I want them to, that I'm suggesting they do right now. One is to start looking at their out-year budgets, so that if they do want to support census promotional activities or outreach activities, they can start putting those line items in their budgets now and won't have to steal money from the parks program to fund those efforts in 2019. So that's one thing. The other thing, which is of interest to you in this room, is that I'm telling them now would be a good time to start consolidating their lists of addresses, because we're going to come to you with this look-up program, and it would be good for your city to have a sense of what address information you have and how to make it work together. When we ask you for your list of addresses, we don't want to be asking every department separately for its own list. And they've been very responsive; the ones that I've talked to already are very interested in trying to do what they can on these activities.

[Audience question, partially inaudible, about slide three: why does the cost per housing unit keep going up?]

Yeah, so that's a great question: why does this line go up? Why doesn't it stay steady? There are a number of factors associated with that, but I think probably the biggest factor is that the population that doesn't respond is becoming more difficult to count. It's a much harder to count population, and the effort to count them takes a lot more.
So I mean, I wish I was a better numbers guy, but as the population grows, even though the percentages remain roughly the same, the absolute numbers grow. I want to say the final mail response rate was 67%, so that means about 33% of the population we had to go knock on their doors, and 33% of the 2010 population is a lot more people than 33% of the 2000 population. So even though it's a per-housing-unit cost, the number of people who don't respond, and the complexity of how hard they are to count, requires a much greater infrastructure.

[Audience question: Can you break out the cost between the 67% who responded initially and the 30-plus percent who didn't?]

Yeah, we could. We actually have that data in terms of what a self-response costs us versus what it costs us to go knock on somebody's door, and I think that cost is something on the order of eight to nine times greater for us to go knock on a door versus somebody that does a self-response. There are other factors too. Certainly the cost of hiring somebody has become more expensive; the administrative costs of bringing people on board are more expensive. But I think the biggest thing is just that it's a much harder to count population. And this isn't unique to the census. I mean, we're the biggest game in town, but other organizations that measure population or collect data are seeing very similar trends in terms of public rates of cooperation. I don't know if you saw this, but Gallup announced yesterday that they're just not even going to do a presidential poll because they can't get people to answer their polls. Wow, Gallup not doing presidential polls, that's a big deal. But just getting the public to cooperate has been a more difficult effort.

[Audience comment, partially inaudible, suggesting that paying people to respond might cost less than following up with everybody who doesn't.]
And so actually, that's something that has been entertained before: doing some kind of a census lottery, so that if you answer your form, you're registered to win a million dollars or whatever it is. But obviously there are issues with that, and so we haven't pursued it. But it's a great point. I mean, if I gave you a hundred bucks, would you answer the census? Yeah? I figured you would. Would you do it for 50? Yes? All right.

[Audience question about how many census offices there will be.]

We have no official statistics yet, but right now we have six regional offices for the Census Bureau, and we're going to keep that structure for the 2020 census. Just to give you a sense of scale in terms of how these cost savings will work: in 2010 we had 12 regional offices that managed 455 local offices, one right here in Sacramento. For 2020 we're planning six regional offices to manage up to 250 local offices. So we'll cut the number of offices roughly in half nationwide, and that's because we're cutting the number of enumerators we hire almost in half. That's big bucks. All right. Thank you very much.
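The cost arithmetic in the Q&A above can be made concrete with a short back-of-the-envelope sketch. The unit costs here are assumed for illustration only: the talk gives just the roughly 67% mail response rate, the "eight to nine times" door-knock multiplier (8.5 is taken as a midpoint), and the office counts; no official per-unit cost figures are used.

```python
# Back-of-the-envelope sketch of the figures mentioned in the talk.
# NOTE: unit costs below are ASSUMED for illustration, not official figures.

self_response_rate = 0.67            # ~67% final mail response in 2010
followup_rate = 1 - self_response_rate

self_response_cost = 1.0             # normalized: one self-response = 1 cost unit
followup_cost = 8.5                  # assumed midpoint of "eight to nine times"

blended = (self_response_rate * self_response_cost
           + followup_rate * followup_cost)
followup_share = followup_rate * followup_cost / blended
print(f"Blended cost per housing unit: {blended:.2f}x a self-response")
print(f"Share of that cost from nonresponse follow-up: {followup_share:.0%}")

# Field office footprint, using the counts stated in the talk.
regional_2010, local_2010 = 12, 455
regional_2020, local_2020 = 6, 250   # "up to 250" local offices planned
print(f"Regional offices cut by {1 - regional_2020 / regional_2010:.0%}")
print(f"Local offices cut by up to {1 - local_2020 / local_2010:.0%}")
```

Under these assumed numbers, the third of households that never mail back ends up driving roughly four-fifths of the blended per-unit cost, which is why the innovations target nonresponse follow-up and the field office footprint rather than self-response processing.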