Linux

For tech wizards and n00bs alike. Questions, answers, or just general hoo-haa.

Moderator: Moderators

Message
Author
Templar GrandMaster
Posts: 706
Joined: Fri Dec 08, 2006 7:06 am
Location: Behind my computer.
Contact:

Re: Linux

#46 Post by RobbieThe1st » Tue Mar 24, 2009 7:32 am

Ok, I have a question for you all:
I am running some complex PHP scripts, and one thing I am experimenting with is launching a child/worker script to do something while the main script goes on to do other stuff (launching and detaching a program).

I found out that I can use PHP's popen function to run a process, use fwrite() to write data to that process, then pclose() to close it.
Ok, fine.

Code: Select all

//open child process
$child = popen('php ./snapshot_child.php','w');
fwrite($child,serialize($input)."\n".serialize($names['known'])."\n".serialize($names['unknown']));
pclose($child);
This works great! I can get my sent data in the child script by using fgets(STDIN);. However... it isn't detached. So, when my child process takes 30 seconds, my main process simply waits until it's finished before continuing.
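For reference, the child side of this is just a few fgets() calls - something like this sketch (simplified; it assumes the serialized data contains no newlines):

```php
<?php
// snapshot_child.php - sketch of the child's stdin-reading side.
// The parent writes three serialize()d chunks separated by "\n",
// so read one line per chunk and unserialize each.
$input   = unserialize(trim(fgets(STDIN)));
$known   = unserialize(trim(fgets(STDIN)));
$unknown = unserialize(trim(fgets(STDIN)));
// ...the slow snapshot work goes here...
?>
```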

Alright, I thought, why not use screen to separate it? I know that I can use screen php ./script.php to launch a script that doesn't end if you close your SSH window.
I found out that screen -dm will auto-detach, which is what I want.

Code: Select all

//open child process
$child = popen('screen -dm php ./snapshot_child.php','w');
fwrite($child,serialize($input)."\n".serialize($names['known'])."\n".serialize($names['unknown']));
pclose($child);
This works. It launches the script, and it is detached... However, I can't get my passed data in my child process there.
I know I could simply make a temp file, save the data I want to pass in it, then open it in the child... but that seems kludgy.

Any thoughts?

Note: I am running CentOS on a VPS.
edit:
this might seem like a PHP question, but I have the PHP parts down - what I don't know much about is pipes, passing data, screen, and launching programs, and that's where I'm having trouble.

-RobbieThe1st

User avatar
Consistently Inconsistent
Posts: 1725
Joined: Wed Jul 30, 2008 10:13 am

Re: Linux

#47 Post by aj » Tue Mar 24, 2009 10:15 am

RobbieThe1st wrote:Ok, I have a question for you all:
I am running some complex PHP scripts, and one thing I am experimenting with is launching a child/worker script to do something while the main script goes on to do other stuff (launching and detaching a program).

I found out that I can use PHP's popen function to run a process, use fwrite() to write data to that process, then pclose() to close it.
Ok, fine.

Code: Select all

//open child process
$child = popen('php ./snapshot_child.php','w');
fwrite($child,serialize($input)."\n".serialize($names['known'])."\n".serialize($names['unknown']));
pclose($child);
This works great! I can get my sent data in the child script by using fgets(STDIN);. However... it isn't detached. So, when my child process takes 30 seconds, my main process simply waits until it's finished before continuing.

Alright, I thought, why not use screen to separate it? I know that I can use screen php ./script.php to launch a script that doesn't end if you close your SSH window.
I found out that screen -dm will auto-detach, which is what I want.

Code: Select all

//open child process
$child = popen('screen -dm php ./snapshot_child.php','w');
fwrite($child,serialize($input)."\n".serialize($names['known'])."\n".serialize($names['unknown']));
pclose($child);
This works. It launches the script, and it is detached... However, I can't get my passed data in my child process there.
I know I could simply make a temp file, save the data I want to pass in it, then open it in the child... but that seems kludgy.

Any thoughts?

Note: I am running CentOS on a VPS.
edit:
this might seem like a PHP question, but I have the PHP parts down - what I don't know much about is pipes, passing data, screen, and launching programs, and that's where I'm having trouble.

-RobbieThe1st
Well, screen works by creating a virtual terminal, so if you auto-detach it, you don't get access to the terminal where you're doing the processing anymore unless you re-attach it, which kind of defeats the purpose of detaching it in the first place, I believe.

What happens if you open the screen with the first command, send your information, then send the command sequence to detach the screen? (Usually Ctrl-A, D.) Not too sure if it'll work - don't have any experience with passing escape sequences through a pipe honestly.

Also, it might result in you having an incredible number of screens open after a while - not too sure what will happen if you don't quit/kill the terminal, since screen will cause the session (complete with terminal) to persist.

I'm just wondering if something like this [ link ] might be better, considering that you're serializing the data anyway - just pass it as a variable to the script.
Of course, it would require a shell script, but I think those are fairly easy to hack together... At least the ones I've looked at look fairly easy. :wink:

Good luck!
avwolf wrote:"No dating dog-girls, young man, your father is terribly allergic!"
y̸̶o͏͏ų̕ sh̡o̸̵u̶̕l̴d̵̡n̵͠'̵́͠t͜͢ ̀͜͝h̶̡àv̸e͡ ̛d̷̨͡o͏̀ne ̶͠͡t҉́h̕a̧͞t̨҉́.̵̧͞.͠͞.͟

User avatar
Administrator
Posts: 7006
Joined: Wed Jan 17, 2007 5:33 pm
Location: Nebraska, USA
Contact:

Re: Linux

#48 Post by avwolf » Tue Mar 24, 2009 6:47 pm

I'm going to guess that your problem is that your I/O is still blocking, Robbie. You'll want to disable blocking I/O on your stream: http://us2.php.net/manual/en/function.s ... ocking.php But that page indicates that popen's streams are always blocking, so proc_open might be a better option, since you can set its streams to non-blocking.

Now, it's been a few years since I did forked processes, and I've never tried to do it in PHP, but that's my first impulse.
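Something along these lines might work - an untested sketch, reusing the snapshot_child.php name from above:

```php
<?php
// Sketch: proc_open gives direct control over the child's pipes.
$descriptors = array(
    0 => array('pipe', 'r'),               // child's stdin; we write to it
    1 => array('file', '/dev/null', 'w'),  // discard child's stdout
    2 => array('file', '/dev/null', 'w'),  // discard child's stderr
);
$proc = proc_open('php ./snapshot_child.php', $descriptors, $pipes);
if (is_resource($proc)) {
    stream_set_blocking($pipes[0], false); // non-blocking writes
    fwrite($pipes[0], serialize($input)."\n");
    fclose($pipes[0]); // child sees EOF on its stdin
    // Careful: proc_close() (or the handle going out of scope)
    // will still wait for the child to exit.
}
?>
```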

Templar GrandMaster
Posts: 706
Joined: Fri Dec 08, 2006 7:06 am
Location: Behind my computer.
Contact:

Re: Linux

#49 Post by RobbieThe1st » Wed Mar 25, 2009 5:08 am

aj wrote: Well, screen works by creating a virtual terminal, so if you auto-detach it, you don't get access to the terminal where you're doing the processing anymore unless you re-attach it, which kind of defeats the purpose of detaching it in the first place, I believe.
I can see that now.
What happens if you open the screen with the first command, send your information, then send the command sequence to detach the screen? (Usually Ctrl-A, D.) Not too sure if it'll work - don't have any experience with passing escape sequences through a pipe honestly.
I don't know what I am doing wrong, but if I just open it with 'screen php ./script.php' it gives me a 'Must be connected to a terminal.' error:

Code: Select all

opened link 0.00164
string to be sent0.001782: a:3:{i:0;s:1:"2";i:1;s:3:"#wg";i:2;i:1237956773;}
a:12:{i:0;a:2:{i:0;s:2:"29";i:1;s:9:"spanthrax";}i:1;a:2:{i:0;s:4:"1777";i:1;s:10:"Big Al 002";}i:2;a:2:{i:0;s:3:"423";i:1;s:12:"Jadi Simondz";}i:3;a:2:{i:0;s:3:"149";i:1;s:12:"School_Boy19";}i:4;a:2:{i:0;s:1:"2";i:1;s:12:"RobbieThe2nd";}i:5;a:2:{i:0;s:3:"322";i:1;s:10:"Aardvark39";}i:6;a:2:{i:0;s:2:"78";i:1;s:8:"Ranma344";}i:7;a:2:{i:0;s:3:"753";i:1;s:12:"Keesie Kille";}i:8;a:2:{i:0;s:2:"23";i:1;s:12:"Theevildead2";}i:9;a:2:{i:0;s:2:"49";i:1;s:12:"Multikill529";}i:10;a:2:{i:0;s:4:"1300";i:1;s:10:"Dieyou2000";}i:11;a:2:{i:0;s:2:"13";i:1;s:9:"Sonixpber";}}
a:8:{i:0;s:9:"Glenn|AFK";i:1;s:13:"[Wg]Billy|GhM";i:2;s:13:"Mr_Brightside";i:3;s:9:"Rex|Sleep";i:4;s:10:"Zeth|Sleep";i:5;s:12:"Indivi|sleep";i:6;s:14:"[IM]RuneScript";i:7;s:1:"X";}
sent d1 0.001827
Must be connected to a terminal.
closed link 0.004276
The code for this:

Code: Select all

			//open child process
			$input = array($line['id'],$buffer['chan'],time()); //recipient ID, channel, time
			$child = popen('screen php ./snapshot_child.php','w');
			echo 'opened link '.(microtime(true) - $start)."\n";
			$sendstr = serialize($input)."\n".serialize($names['known'])."\n".serialize($names['unknown'])."\n";
			echo 'string to be sent'.(microtime(true) - $start).': '.$sendstr;
			fwrite($child,$sendstr);
			echo 'sent d1 '.(microtime(true) - $start)."\n";
			pclose($child);
			echo 'closed link '.(microtime(true) - $start)."\n";
Also, it might result in you having an incredible number of screens open after a while - not too sure what will happen if you don't quit/kill the terminal, since screen will cause the session (complete with terminal) to persist.
I figured as much - my solution is to have a time-limit: PHP has this nice function 'set_time_limit(%time in seconds%);'. It seems to work.
I'm just wondering if something like this [ link ] might be better, considering that you're serializing the data anyway - just pass it as a variable to the script.
Of course, it would require a shell script, but I think those are fairly easy to hack together... At least the ones I've looked at look fairly easy. :wink:
Two problems: One, the amount of data being passed may be large, depending on the circumstances. I recall from one of those comments that, at least on Windows, there is a maximum command-line length. On Linux, however... I dunno.
The second is that I have absolutely no clue how to go about writing a shell script.
Also, it seems kind of kludgy. If I can't get this stdin/stdout method to work, then I may have to use it, but I am hoping to get this to work.

BTW:
Out of curiosity, I used the above code without the screen in the command line: It worked, but it still ties up main-script time.
I found that it doesn't actually tie up my main script until the 'pclose($child);'. I did try removing the pclose() line, but at that point it simply started tying the main script up when the current function ended (all this code is inside a function).


User avatar
Administrator
Posts: 7006
Joined: Wed Jan 17, 2007 5:33 pm
Location: Nebraska, USA
Contact:

Re: Linux

#50 Post by avwolf » Wed Mar 25, 2009 4:59 pm

RobbieThe1st wrote:BTW:
Out of curiosity, I used the above code without the screen in the command line: It worked, but it still ties up main-script time.
I found that it doesn't actually tie up my main script until the 'pclose($child);'. I did try removing the pclose() line, but at that point it simply started tying the main script up when the current function ended (all this code is inside a function).
I don't know if you'll find a way around that -- pclose( $child ) cleans up after the child process (and PHP automatically does something similar when variables go out of scope). That means that when the process closes, either by explicitly calling pclose() or by the handle going out of scope, PHP must wait for it to finish what it's doing. If that's undesirable, you'll need to move your fork outside of your function, and use the function to pass information to the already existing child process.
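Untested, but I mean something like this - open the pipe once at the top level, and have the function only write to it:

```php
<?php
// Sketch: one long-lived worker, opened outside any function.
$worker = popen('php ./snapshot_child.php', 'w');

function send_snapshot($worker, $data) {
    // One serialized job per line; the child loops on fgets(STDIN).
    fwrite($worker, serialize($data)."\n");
}

// ...the main loop calls send_snapshot($worker, $input) as needed...

pclose($worker); // only here, at shutdown, does the parent wait
?>
```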

Templar GrandMaster
Posts: 706
Joined: Fri Dec 08, 2006 7:06 am
Location: Behind my computer.
Contact:

Re: Linux

#51 Post by RobbieThe1st » Thu Mar 26, 2009 7:21 am

avwolf wrote:
RobbieThe1st wrote:BTW:
Out of curiosity, I used the above code without the screen in the command line: It worked, but it still ties up main-script time.
I found that it doesn't actually tie up my main script until the 'pclose($child);'. I did try removing the pclose() line, but at that point it simply started tying the main script up when the current function ended (all this code is inside a function).
I don't know if you'll find a way around that -- pclose( $child ) cleans up after the child process (and PHP automatically does something similar when variables go out of scope). That means that when the process closes, either by explicitly calling pclose() or by the handle going out of scope, PHP must wait for it to finish what it's doing. If that's undesirable, you'll need to move your fork outside of your function, and use the function to pass information to the already existing child process.
That's kind of what I figured... It was why I was trying to use screen to do just that.

User avatar
Consistently Inconsistent
Posts: 1725
Joined: Wed Jul 30, 2008 10:13 am

Re: Linux

#52 Post by aj » Thu Mar 26, 2009 1:09 pm

RobbieThe1st wrote:
What happens if you open the screen with the first command, send your information, then send the command sequence to detach the screen? (Usually Ctrl-A, D.) Not too sure if it'll work - don't have any experience with passing escape sequences through a pipe honestly.
I don't know what I am doing wrong, but if I just open it with 'screen php ./script.php' it gives me a 'Must be connected to a terminal.' error:
o.O Mine just doesn't run.

I copied and pasted your code more or less literally, and the parent seems to run fine - it's just that the child script doesn't execute:
index.php:

Code: Select all

<?php
         $child = popen('screen php ./index_child.php','w');
         echo 'opened link '.(microtime(true) - $start)."\n";
         $sendstr = "testing\n";
         echo 'string to be sent '.(microtime(true) - $start).': '.$sendstr;
         fwrite($child,$sendstr);
         echo 'sent d1 '.(microtime(true) - $start)."\n";
         pclose($child);
         echo 'closed link '.(microtime(true) - $start)."\n";
?>
index_child.php:

Code: Select all

<?php
$myFile = "./testFile.txt";
$fh = fopen($myFile, 'w');
fwrite($fh, date('h:i:s') . "\n");
sleep(3);
fwrite($fh, date('h:i:s') . "\n");
fclose($fh);
?>
Result of index.php:

Code: Select all

opened link 1238070737.7959 string to be sent1238070737.7966: testing sent d1 1238070737.7968 closed link 1238070737.8071 
testFile.txt: The problem is that nothing appears in the file "testFile.txt". The directory is correct, file permissions work, the file is readable, etc. For some reason the script just doesn't write to the file, and I don't have time to investigate it, unfortunately. I'd suggest trying the kludgy way - write the data to a file, then read the file back in the child - just to see if it works at this point.

Kludgy code that works is better, IMO, than code that is as stylish as possible but not functional.
Mind you, I'm biased - the sole major PHP program I've personally written uses cURL to fetch a webpage, stores the webpage to a file, then PHP goes through and reads the file, substrings the appropriate 11 characters out, and uses them. Not pretty, but it works.

Traveler
Posts: 18
Joined: Fri Aug 22, 2008 1:24 pm

Re: Linux

#53 Post by icewind » Thu Mar 26, 2009 8:08 pm

aj wrote:
RobbieThe1st wrote:
What happens if you open the screen with the first command, send your information, then send the command sequence to detach the screen? (Usually Ctrl-A, D.) Not too sure if it'll work - don't have any experience with passing escape sequences through a pipe honestly.
I don't know what I am doing wrong, but if I just open it with 'screen php ./script.php' it gives me a 'Must be connected to a terminal.' error:
o.O Mine just doesn't run.

Fairly literally copied and pasted your code and it seems to run fine, just that the child script doesn't execute:
index.php:

Code: Select all

<?php
         $child = popen('screen php ./index_child.php','w');
         echo 'opened link '.(microtime(true) - $start)."\n";
         $sendstr = "testing\n";
         echo 'string to be sent '.(microtime(true) - $start).': '.$sendstr;
         fwrite($child,$sendstr);
         echo 'sent d1 '.(microtime(true) - $start)."\n";
         pclose($child);
         echo 'closed link '.(microtime(true) - $start)."\n";
?>
index_child.php:

Code: Select all

<?php
$myFile = "./testFile.txt";
$fh = fopen($myFile, 'w');
fwrite($fh, date('h:i:s') . "\n");
sleep(3);
fwrite($fh, date('h:i:s') . "\n");
fclose($fh);
?>
Result of index.php:

Code: Select all

opened link 1238070737.7959 string to be sent1238070737.7966: testing sent d1 1238070737.7968 closed link 1238070737.8071 
testFile.txt: The problem is that nothing appears in the file "testFile.txt". The directory is correct, file permissions work, the file is readable, etc. For some reason the script just doesn't write to the file, and I don't have time to investigate it, unfortunately. I'd suggest trying the kludgy way - write the data to a file, then read the file back in the child - just to see if it works at this point.

Kludgy code that works is better, IMO, than code that is as stylish as possible but not functional.
Mind you, I'm biased - the sole major PHP program I've personally written uses cURL to fetch a webpage, stores the webpage to a file, then PHP goes through and reads the file, substrings the appropriate 11 characters out, and uses them. Not pretty, but it works.
Not sure, but I think popen just runs the file the way you would from the terminal, not with the PHP interpreter, so your problem would be that the file is opened but the system doesn't know how to run it.
Try adding this to the top of your file:

Code: Select all

#!/usr/bin/php
This allows you to run the code like you would run any other script.

Templar GrandMaster
Posts: 706
Joined: Fri Dec 08, 2006 7:06 am
Location: Behind my computer.
Contact:

Re: Linux

#54 Post by RobbieThe1st » Fri Mar 27, 2009 6:26 am

Not sure, but I think popen just runs the file the way you would from the terminal, not with the PHP interpreter, so your problem would be that the file is opened but the system doesn't know how to run it.
Try adding this to the top of your file:

Code: Select all

#!/usr/bin/php
This allows you to run the code like you would run any other script.
Good suggestion, but I don't think that this is the problem - that's why the command is "screen php ./script.php", which invokes the PHP interpreter explicitly.

AJ, I just tested again and found the same thing you did. "screen php ./script.php" will give that no-terminal error and will not run the script at all. However: "screen -dm php ./script.php" *WILL* run it, but I can't pass anything to it obviously.


What I ended up doing (and this is by no means the best way of doing it) is sticking the data into the command string as an argument.
I ended up having to rawurlencode it, as several of the characters used by serialize() get interpreted by the shell otherwise.

Code: Select all

$sendstr = rawurlencode(serialize($input).chr(30).serialize($names['known']).chr(30).serialize($names['unknown']));
$child = popen('screen -dm php ./snapshot_child.php '.$sendstr,'w');
pclose($child);

Now, don't get me wrong - I still would love to find a way to do it with pipes, but at least I have a -working- solution now.
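For completeness, the child side just reverses the encoding - roughly:

```php
<?php
// snapshot_child.php - sketch of the argv-decoding side.
// The parent passed one rawurlencoded argument with chr(30) separators.
$raw = rawurldecode($argv[1]);
list($inputStr, $knownStr, $unknownStr) = explode(chr(30), $raw);
$input   = unserialize($inputStr);
$known   = unserialize($knownStr);
$unknown = unserialize($unknownStr);
// ...do the slow work here, fully detached from the parent...
?>
```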


-RobbieThe1st

User avatar
Consistently Inconsistent
Posts: 1725
Joined: Wed Jul 30, 2008 10:13 am

Re: Linux

#55 Post by aj » Fri Mar 27, 2009 1:32 pm

RobbieThe1st wrote:AJ, I just tested again and found the same thing you did. "screen php ./script.php" will give that no-terminal error and will not run the script at all. However: "screen -dm php ./script.php" *WILL* run it, but I can't pass anything to it obviously.


What I ended up doing (and this is by no means the best way of doing it) is sticking the data into the command string as an argument.
I ended up having to rawurlencode it, as several of the characters used by serialize() get interpreted by the shell otherwise.

Code: Select all

$sendstr = rawurlencode(serialize($input).chr(30).serialize($names['known']).chr(30).serialize($names['unknown']));
$child = popen('screen -dm php ./snapshot_child.php '.$sendstr,'w');
pclose($child);

Now, don't get me wrong - I still would love to find a way to do it with pipes, but at least I have a -working- solution now.
Well, at least it works.

And now I have something to read up on. :grin:
I knew pipes existed before, but just as some mystical thing.

As for the screen thing, I just have no idea why it doesn't like running php ./script.php. I'm going to try screen -dm when I have some time to see if that works or not. Would be strange if it does. :?

Traveler
Posts: 19
Joined: Mon May 25, 2009 9:09 pm

Re: Linux

#56 Post by techdude300 » Fri May 29, 2009 10:34 pm

Alright, this is a bit of a long post so....

I've tried Ubuntu and really liked it. The problem is, I've got several things that will only run in Windows (Wine won't work with them). The last thing I want to do is wipe my drive and start over. I just want to move my existing XP partition inside a virtual machine under Ubuntu. I tried VMWare Converter, but I don't have the disk space to make a copy of my XP partition. Is there any way at all I can do this?

Apprentice
Posts: 113
Joined: Wed Jan 07, 2009 6:54 pm
Contact:

Re: Linux

#57 Post by etam » Sat May 30, 2009 10:49 am

In your situation I wouldn't try to move the existing XP to a virtual machine. I'd just back up all important data to an external disk, install Ubuntu, and then install XP in a virtual machine.

For personal use I recommend VirtualBox.

Templar GrandMaster
Posts: 706
Joined: Fri Dec 08, 2006 7:06 am
Location: Behind my computer.
Contact:

Re: Linux

#58 Post by RobbieThe1st » Tue Oct 13, 2009 7:31 am

I have a few issues. I have been using Ubuntu 9.04 on my laptop for a few months now, with Gnome, and decided to try KDE for the heck of it. It worked great, and I liked it.

Now, just recently, I saw the Ubuntu 9.10 beta, and upgraded over the web. After the install, I restarted, logged in, and saw This. All the buttons still work, but it's a bit hard to see what you are doing when the various bars look like that.

A little googling later, I couldn't find much, but decided to try an alternate window manager to see if that helped. I tried Compiz (which, I found out, was already installed), ran "compiz --reload", and it worked! I now see This.

Which brings me to my problems:
1. KWin has those screwed-up graphics. What's going on?
2. Minor, but the temperature label on that temperature widget is shifted and has that black square next to it. This happened when I upgraded to 9.10; probably related to #1.
3. Compiz works, but it's using the window borders and such from Gnome. How do I configure it (preferably with a GUI) in KDE, and how can I get some of the KWin themes like Oxygen?
4. Where's my shut-down button? I think this happened at the same time as everything else, but where did it go? There's only log out and switch user there in image #2.

5. Unrelated, but occasionally my computer crashes. The screen shuts off and the numlock/capslock (I forget which) light blinks constantly until I turn it off. What does this mean, and how can I try to figure out this problem?


Thanks, all.

User avatar
Apprentice
Posts: 102
Joined: Sat Jan 24, 2009 8:29 pm
Location: England

Re: Linux

#59 Post by Symphona » Tue Mar 23, 2010 9:45 pm

Good evening all,

I seem to be having a bit of a problem with my drives at the moment; as my laptop stands (Asus EEE 1005HA, for the record) I have two drives, 77 GB each. I'm fairly convinced they are actually two separate drives, but I can't confirm that... Anyway, Windows takes up the entirety of the first drive, and Ubuntu my second (the latest Netbook Remix).

My problem is, if I partition my Linux drive to anything less than the full 77 GB, it leaves the rest of the space as 'unusable', which is a bit of a pain; for one thing a swap partition would be lovely, and I'm considering having a play with a few other Linux distros in the space I have.

I remember my desktop working fine when I added Linux to it; it allowed me to re-partition without bothering anything already on the drive. Am I just being thick?

Many thanks, Symphona
Linux is hard...

Templar GrandMaster
Posts: 706
Joined: Fri Dec 08, 2006 7:06 am
Location: Behind my computer.
Contact:

Re: Linux

#60 Post by RobbieThe1st » Wed Mar 24, 2010 3:35 am

What I suggest is imaging your partitions and backing them up.
Then boot a LiveCD/LiveUSB, install/run GParted, delete everything, and repartition from scratch.

Once you do that, you can see about restoring your partitions selectively, perhaps resizing them as you go.

---
I'm pretty sure your laptop doesn't have two physical hard disks; 77 GB is an odd number, and it's too small a laptop to have two. It's far more likely one disk with some odd partitioning scheme on it.

Post Reply