niche last won the day on November 10 2014


Community Reputation

128 Excellent

About niche

  • Rank
    Devoted Member
  • Birthday 01/01/1958

Profile Information

  • Gender
  • Location
    Lincoln, Nebraska, USA
  • Interests
    God, niche marketing, inventing, coding, building large scale models, sci fi

Previous Fields

  • Languages
    php, mysql, html, css, javascript (learning)
  1. My understanding is that PDO does the sanitization for you (through prepared statements). Validation is a different issue.
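For context, a minimal sketch of that distinction with PDO prepared statements (the connection details, table, and column names here are made up for illustration):

```php
<?php
// Hypothetical connection details -- replace with your own.
$pdo = new PDO('mysql:host=localhost;dbname=test', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// The placeholder keeps user input out of the SQL string,
// so PDO handles the escaping ("sanitization") for you.
$stmt = $pdo->prepare('SELECT name FROM users WHERE id = :id');
$stmt->bindParam(':id', $_GET['id'], PDO::PARAM_INT);
$stmt->execute();

// Validation is still your job: check that the value makes sense
// (e.g. is it a positive integer?) before you ever use it.
```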
  2. Bingo. This will help too. These will get you started. Also, remember that Google is your friend: always consider googling your topic. A good search will usually turn up plenty of likely answers. That's especially helpful when almost everyone's asleep.
  3. This will get you started:
  4. The answer to your specific question is that you'd use the $_GET or $_POST superglobal, depending on the form's method attribute.
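A quick sketch of both, assuming a form field named "username" (the field name is made up here):

```php
<?php
// For <form method="get">, the submitted value arrives in $_GET:
$name = isset($_GET['username']) ? $_GET['username'] : '';

// For <form method="post">, it arrives in $_POST instead:
$name = isset($_POST['username']) ? $_POST['username'] : '';

echo htmlspecialchars($name); // escape before echoing back to the page
```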
  5. You probably have a problem with how you bindParam, especially since you're getting a different row count than you expect. Is :voor really an INT? Also, you should use try/catch to surface errors.
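A hedged sketch of that pattern — :voor is the placeholder from the question; the table and column names are illustrative, not the poster's actual schema:

```php
<?php
try {
    // Hypothetical query; substitute your real table and column.
    $stmt = $pdo->prepare('SELECT * FROM personen WHERE voornaam = :voor');

    // If :voor is really a string column, bind it as a string, not an
    // INT -- a type mismatch can produce a surprising row count.
    $stmt->bindParam(':voor', $voor, PDO::PARAM_STR);
    $stmt->execute();
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
} catch (PDOException $e) {
    // With PDO::ATTR_ERRMODE set to ERRMODE_EXCEPTION, failures land here.
    echo 'Query failed: ' . $e->getMessage();
}
```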
  6. dsonesuk's example is much simpler than the one I found:
  8. I use FileZilla as an FTP client to upload.
  9. It's an array that gives you a place to store data a user generates during their session. It's an important concept.
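A minimal illustration of the idea — the "cart" key is just an example name:

```php
<?php
session_start(); // must be called before any output is sent

// Store per-user data in the $_SESSION array; it persists
// across page loads for the same visitor.
$_SESSION['cart'] = array('apples');
$_SESSION['cart'][] = 'oranges';

// On a later request (after session_start()), the same array
// comes back with both items still in it.
echo count($_SESSION['cart']);
```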
  10. We need to see more code. Otherwise, look into the CSS box model, divs, and floats.
  11. Understood. I haven't encountered many limitations with MySQL; obviously this is one of them. Thanks for the reinforcement, JSG.
  12. I think I'm hitting a limit that looks like a time limit, but it's probably something else. What, I don't know. Here's my script:

     UPDATE du_raw2, cris4 SET du_raw2.sn2 = WHERE du_raw2.key1 LIKE CONCAT('% ',,' %')

     My localhost is a WampServer with MySQL 5.6.17, PHP 5.5.12, and Apache 2.4.9. My code parses mailing addresses. It targets the key1 column in the table du_raw2. The key1 column contains addresses (all the address components on a single line). cris4 contains all the street name possibilities for a specific city; there are 2000 of them. The script scans each address in the target file (when it has fewer than 2000 rows) and returns the street name if it's found (from the 2000 street name possibilities).

     It successfully updates 1000 target rows in a second. The query log says the connections are opened and closed. It successfully updates 2000 target rows in three seconds. The query log says the connections are opened and closed. It times out somewhere between 2000 and 3000 target rows. The query log says the connections are opened, but never closed.

     When I reduce the number of street name possibilities from 2000 to 50, my script successfully processes 3000 target rows in 26 seconds. The query log says the connections are opened and closed. I expected it to take less than 26 seconds (causing me to think it's the way I designed the code, but then I'd need to know what concept I'm violating).

     My script isn't in a loop (yet). I have 100,000 rows to process. I'm just peeling it back to figure out where it chokes. It's not the data; it's the number of rows and/or the way I'm applying the code. I'm using it with PHP, but I doubt PHP is the problem; I've already played with set_time_limit(). I suppose I could work around the problem in a while loop (it would only run 50 times), but I'd rather know what the true problem is. What do you think? Where else can I look?
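For reference, the general shape of the multi-table UPDATE described above, plus the chunked workaround mentioned. The column elided in the original query is written here as a made-up cris4.street, and the id-based batching assumes a numeric primary key on du_raw2 — both are assumptions, not the poster's actual schema:

```php
<?php
// Hypothetical column names: "street" and "id" are placeholders.
$sql = "UPDATE du_raw2, cris4
        SET du_raw2.sn2 = cris4.street
        WHERE du_raw2.key1 LIKE CONCAT('% ', cris4.street, ' %')";

// Chunked alternative: process the 100,000 target rows in batches
// so no single statement runs long enough to hit the limit.
for ($start = 0; $start < 100000; $start += 2000) {
    $stmt = $pdo->prepare($sql . " AND du_raw2.id BETWEEN :lo AND :hi");
    $stmt->execute(array(':lo' => $start, ':hi' => $start + 1999));
}
```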
  13. Do you have any code so far?
  14. Here's the tutorial that will tell you how to find your CSS. Even if someone just gave you the correction you probably need, you'd still have to know how to find your CSS to make the change. This will show you how: