
confused and dazed



  1. I did some basic internet searching on the error I got when clicking the .js link under View Source > Inspect. The error is "Not allowed to load local resource", and the answers said the page is not allowed to access your local files. I actually don't have the file on my server; I only have it in my local files. I misspoke about that earlier, and I'm sure you would have mentioned it had I not misled you. Thanks for pointing out that I can use View Source and Inspect to click the links and view the errors. THANKS!
  2. It does not... but all the other jQuery functions on my page are working, just not fadeOut.
  3. Hello internet. I am trying to use the following code, and my webpage is not running my fadeOut function. I'm using a <script> tag to load the jQuery library from my server.

     CODE:

     $("div").click(function() {
       $(this).fadeOut("slow");
     });

     However, if I replace

     <script type="text/javascript" src="jquery.min.js"></script>

     with

     <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.1.0/jquery.min.js"></script>

     it works just fine. Why is this?

     NOTES:
     1. I am placing the jQuery code within my <head> tags.
     2. jquery.min.js is the 3.1.0 compressed file I have on my server.
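A quick way to narrow this down is to test whether the local copy of jQuery loaded at all before wiring up any handlers. This is only a sketch; the file and path names are the ones assumed from the post above.

```javascript
// Sketch (path names assumed): if the src attribute on the local
// <script> tag points at the wrong path, the browser loads nothing
// and the global `jQuery` never becomes defined, so every later
// jQuery call fails silently or throws.
function jqueryLoaded(globalScope) {
  // `typeof` is safe to use even when the property was never defined
  return typeof globalScope.jQuery !== 'undefined';
}

// In the page itself you would run something like:
//   if (!jqueryLoaded(window)) {
//     console.error('jquery.min.js did not load - check the src path');
//   } else {
//     jQuery(function ($) {
//       $('div').click(function () { $(this).fadeOut('slow'); });
//     });
//   }
```

If the check fails with the local file but passes with the CDN URL, the problem is the path to jquery.min.js (it is resolved relative to the page's URL, not the server's file system).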
  4. What's strange is that Safari is the only browser doing that... why would that be? I created the image in Microsoft Excel and saved it as a PNG in the picture editor I use. I like Excel's shapes and effects, so I tend to create images there, paste them into a picture editor, finish them up, and save them as PNGs. I placed the image as a button background using a class that references a remote CSS file - code below. I have never had an issue like this, and I have used this technique for the past several years.

     Source code:

     <div class='sb1'>
       <input type='button' name='submitb' class='b1' value='Begin Scoring' id='submitb' onClick='check();'>
     </div>

     Remote CSS code:

     .sb1 {
       width: 200px;
       height: 45px;
       position: absolute;
       margin-left: 30px;
       margin-top: 30px;
       background: url(b3.png);
       border: 0px solid black;
     }
     .b1 {
       width: 200px;
       height: 40px;
       position: absolute;
       margin-top: 2px;
       background-color: transparent;
       font-size: 12pt;
       font-family: Calibri;
       color: #ffffff;
       border: 0px solid black;
     }
  5. Hello Internet, I'm having an issue with how Safari is rendering a picture on my smartphone, but Chrome renders it fine. Any thoughts on why this is happening? Please see the attached image.
  6. What other options do you recommend? If PHP isn't the tool for the job, then what can I start trying to teach myself? BTW - I used the header() call to redirect and it worked. One page, two redirects.
  7. Ok. Having said that - I have no control over the networks I am using, so it appears I need another way around this. First I thought I would streamline my code through the routines and functions I am asking it to run, but the fact remains that I need to cURL 50+ pages and send the data to MySQL; I cannot get around that. So what I have done is limit each run to 20 requests and chain the PHP files together: my_php_file1.php handles the first 20, my_php_file2.php handles the second 20, and my_php_file3.php handles the remaining requests. At the end of each PHP file I have a simple form with a submit button that goes to the next file. Any thoughts on what might work better than having to submit three PHP requests in series to complete all 50+ requests?
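The three-file workaround above is essentially manual batching. The splitting step itself can be sketched like this (the batch size of 20 and total of 52 come from the posts; the function name is mine):

```javascript
// Split a list of page URLs into fixed-size batches so that each
// run only issues a bounded number of requests (20 here, matching
// the my_php_file1/2/3.php split described above).
function chunk(items, batchSize) {
  var batches = [];
  for (var i = 0; i < items.length; i += batchSize) {
    batches.push(items.slice(i, i + batchSize));
  }
  return batches;
}

// e.g. 52 URLs in batches of 20 -> three batches of 20, 20 and 12
```

Instead of a form with a submit button, each batch's page could redirect to the next one automatically (with header() or a meta refresh) until every batch has run, so the whole chain completes without manual clicks.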
  8. OK, so I did this, and the Network tab instantly shows the PHP file that the JavaScript function posts to. The Network tab in the first window shows:

     Name - the_name_of_my_php_file
     Method - POST
     Status - 504
     Type - text/...
     Initiator - the_name_of_my_js_file
     Size - 68 B
     Time - shows "pending" until it cuts off, then 45.99 s

     The Timing tab in the second window shows:

     Stalled - 1.818 ms
     Request sent - 0.299 ms
     Waiting (TTFB) - 45.99 s
     Content Download - 0.782 ms

     And the Preview and Response tabs in the second window show: "Failed to load response data".
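A 504 after roughly 45 seconds of "Waiting (TTFB)" is the gateway in front of PHP giving up on the connection, not PHP itself finishing. One common workaround is to kick the long job off and then poll a cheap status check from JavaScript instead of holding a single long request open. A sketch of the polling side, with every function name assumed (checkStatus would wrap a request to some small status endpoint that you would have to write):

```javascript
// Poll checkStatus() (assumed to return a Promise<boolean>: true
// once the server-side job reports done) every intervalMs, instead
// of one 45-second POST that the gateway kills with a 504.
function pollUntilDone(checkStatus, intervalMs, maxAttempts) {
  return new Promise(function (resolve, reject) {
    var attempts = 0;
    function tick() {
      attempts += 1;
      checkStatus().then(function (done) {
        if (done) {
          resolve(attempts);            // finished: report how many polls it took
        } else if (attempts >= maxAttempts) {
          reject(new Error('gave up after ' + attempts + ' polls'));
        } else {
          setTimeout(tick, intervalMs); // not done yet: ask again shortly
        }
      }, reject);
    }
    tick();
  });
}
```

Each individual status request returns in milliseconds, so nothing ever sits in "Waiting (TTFB)" long enough for the gateway to cut it off.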
  9. So I did a bunch of test runs, and it appears my Chrome browser only completes 24 of the 52 requests, judging by the echo statements confirming the code has executed (I used a die statement to limit the number of requests). Chrome will only leave the request running for about 41 seconds. I looked in the browser settings and there does not seem to be any setting for this. Any thoughts?
  10. I called my hosting company and they said to check the php5.ini file and change max_execution_time. I changed it to 300 and it is still doing the same thing. The guy seemed pretty knowledgeable, and he said the php5.ini file controls that for my hosting account.
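For reference, the runtime directive is max_execution_time. A minimal sketch of the relevant php5.ini lines, assuming the host reads a per-directory php5.ini as described (the 300-second value is the one from the post):

```ini
; php5.ini - per-directory override file on many shared hosts
; max_execution_time caps how long a script may run (the default is 30 seconds)
max_execution_time = 300
```

Note that a set_time_limit() call inside the script overrides this per request, and a proxy or gateway in front of PHP may still cut the connection sooner regardless of what PHP allows - which would explain a page dying early even after raising the limit.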
  11. Something interesting: the page won't load and show the echo statements I have in there. However, over time, as I refresh my database, all 50+ rows end up showing in the database. It will show 20, then I refresh and it shows 30, then 40, etc. - but all of this happens after the PHP page has stopped showing the loading spinner in the tab. What does all this mean?
  12. I timed it, and the page stops running after about 30 actual seconds.
  13. Does it matter where the call is placed? Because it still appears to be stopping before the limit and not completing execution of the code. Right now I have placed it as shown below:

     <?php
     set_time_limit(200);

     $con = mysql_connect("source","db","pw");
     if (!$con) {
       die('Could not connect: ' . mysql_error());
     }
     mysql_select_db("db", $con);

     while ($row = mysql_fetch_assoc($result)) {
       $ch = curl_init();
       curl_close($ch);

       $dom = new DOMDocument();
       @$dom->loadHTML($postResult);
       foreach ($dom->getElementsByTagName('script') as $links1) {
         $links2 = $dom->saveXML($links1);
         if (preg_match(something)) {
           mysql_query("UPDATE something SET this='bingo' WHERE name='thisstuff'");
         }
       }
     }
     ?>
  14. So I am well on my way to getting my next project up and running. Thanks for the help through this. I do have another question, though: my program seems to time out, and the code cannot complete all the commands on the rows ($row) from the database. In all, I'm grabbing 50+ different webpages, scraping the data I care about, and sending it to my database (one page at a time). Is there anything anyone can suggest for this issue?

     while ($row = mysql_fetch_assoc($result)) {
       $username = 'usr';
       $password = 'pas';

       $ch = curl_init();
       $agent = $_SERVER["HTTP_USER_AGENT"];
       curl_setopt($ch, CURLOPT_USERAGENT, $agent);
       curl_setopt($ch, CURLOPT_URL, $thislink);
       curl_setopt($ch, CURLOPT_POST, 1);
       curl_setopt($ch, CURLOPT_POSTFIELDS, 'user=' . $username . '&pass=' . $password);
       curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
       curl_setopt($ch, CURLOPT_FOLLOWLOCATION, 1);
       curl_setopt($ch, CURLOPT_COOKIEJAR, 'cookie.txt');
       curl_setopt($ch, CURLOPT_COOKIEFILE, 'cookie.txt');
       $postResult = curl_exec($ch);
       curl_close($ch);

       $dom = new DOMDocument();
       @$dom->loadHTML($postResult);
       foreach ($dom->getElementsByTagName('script') as $links1) {
         $links2 = $dom->saveXML($links1);
         if (preg_match(something)) {
           mysql_query("UPDATE something SET this='bingo' WHERE name='thisstuff'");
         }
       }
     }